Jon Seals

Study exposes a lack of readiness for EU data laws, shows organisations struggling to enforce acceptable usage policies, and reveals the activity of Europe's most ‘dangerous' cloud user

LONDON – Skyhigh Networks, the Cloud Visibility and Enablement company, today released its latest quarterly European Cloud Adoption and Risk Report. The report analyses real-life usage data from 1.6 million European users.

In Europe, the number of cloud services in use by the average company increased 23 percent, rising from 588 in Q1 to 724 in Q3. However, not all of these services are ready for the enterprise. Developed in conjunction with the Cloud Security Alliance, Skyhigh's Cloud Trust Program tracks the attributes of cloud services and ranks them according to risk. The report found that only 9.5 percent of all services meet the most stringent security requirements, including strong password policies and data encryption.

The report also reveals a worrying lack of conformance to the EU Data Protection Directive, particularly with regard to the transfer of personally identifiable information outside Europe. Skyhigh found that nearly three quarters (74.3 percent) of the cloud services used by European organisations do not meet the requirements of the current privacy regulations, with data being sent to countries without adequate levels of data protection. With stricter policies and harsher penalties set to come into force soon, organisations have only a short window to address these issues.

"The growth in cloud services being used in Europe is testament to the benefits users see in the services on offer," said Rajiv Gupta, CEO, Skyhigh Networks. "On the other hand, the IT department needs to make sure that these services don't put the organisation's intellectual property at risk.  This report analyses real-world cloud usage data to shine a light on the extent of Shadow IT."

Echoing the previous report, much cloud adoption still takes place under the radar of IT departments: 76 percent of IT professionals do not know the scope of Shadow IT at their companies, but want to. A key problem IT teams face as a result is enforcing an acceptable use policy. The report found that IT personnel are often surprised to discover that cloud services they believed were blocked are actually being used by employees. As part of the study, Skyhigh surveyed IT professionals about their expected block rates for certain cloud services, then compared these with actual block rates measured in the wild. The resulting ‘cloud enforcement gap' was striking: for example, 44 percent of IT professionals intended to block YouTube, but only 1 percent of organisations blocked the service comprehensively.
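
A toy sketch of that enforcement-gap calculation: the YouTube figures come from the report, while the other services and all their rates are invented placeholders.

```python
# 'Cloud enforcement gap': the share of IT professionals who intend to block
# a service minus the share of organisations that actually block it.
# Only the YouTube numbers are from the report; the rest are hypothetical.

intended_block_rate = {"YouTube": 0.44, "ServiceA": 0.30, "ServiceB": 0.25}
actual_block_rate   = {"YouTube": 0.01, "ServiceA": 0.05, "ServiceB": 0.02}

for service, intended in intended_block_rate.items():
    gap = intended - actual_block_rate[service]
    print(f"{service}: intended {intended:.0%}, "
          f"actual {actual_block_rate[service]:.0%}, gap {gap:.0%}")
```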

In terms of trends, the report found that 80 percent of all corporate data uploaded to the cloud is sent to just 15 percent of cloud services, which makes it easier for IT teams to prioritise security and risk analysis. The top destination for corporate data in Europe is Microsoft Office 365, followed by Salesforce. However, there is a long tail of services below this top 15 percent, and this is where 73 percent of compromised accounts, insider threats and malware originate.

"The gap between perception and reality uncovered by this study is worrying, as so much corporate data is being uploaded to cloud services that IT teams believe they have blocked," continued Gupta. "It only takes one misstep to cause a serious security or compliance threat to an organisation. As such, mechanisms should be in place not only to discover which cloud services are being used, but also to analyse the risk profile of these services and understand the true implications for enterprise data security."

Finally, by digging deeper into the statistics, the report has for the first time revealed the behaviour of the most ‘dangerous' cloud user in Europe. This person uploaded more than 17.5GB of data to 71 high-risk cloud services over a three-month period, the equivalent of 8,750 copies of War and Peace. Some of these high-risk services are also used to distribute malware into organisations. This highlights the threat a single user can pose to an organisation and its data.

The full report is available here: www.skyhighnetworks.com/cloud-report

The idea of data as philanthropy received a Silicon Valley boost this week when Informatica and Cloudera announced plans to support the non-profit DataKind. Both Informatica, which specializes in data integration, and Cloudera, a Hadoop analytics company, will jointly sponsor DataKind programs and projects.

DataKind applies data science to global problems by making data scientists available to governments and other mission-driven organizations working on issues such as education, vaccine delivery and poverty eradication. For example, Bayes Impact created a model for the micro-financier Zidisha that helps reduce fraud while maximizing loans to honest borrowers.

Big Data has a long track record of social justice work. For instance, last year, ITBE's Don Tennant wrote about Big Data's use in the fight against human trafficking. Earlier this year, civic technologist Matt Stempeck proposed that businesses make data donations to non-profits, which prompted my earlier post about the business value of data philanthropy.

...

http://www.itbusinessedge.com/blogs/integration/emerging-trend-data-scientist-as-humanitarian-worker.html

As people increasingly turn to social media after a disaster — both to get information and check to see if their friends and family have been affected — the platforms are creating disaster-specific tools. Twitter Alerts, for example, was launched in September 2013 as a way to highlight emergency information from vetted agencies across the social networking platform. And now Facebook has joined the movement with a new tool, called Safety Check, that’s designed to be an easy way for users to let their friends and family members know if they’re OK after a disaster.

Introduced via a blog post on Oct. 15, the tool, the company says, helps users let others know they're safe, check on people in the affected area, and mark friends as safe. The feature works on Facebook's desktop and mobile applications, including Android and iOS.

When users are within the vicinity of an area affected by a disaster, they will receive a notification from Facebook asking if they’re safe. Selecting “I’m Safe” will post an update on that user’s Facebook page.
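
The mechanics described above reduce to a simple flow: find users whose last known location falls inside the affected area, prompt them, and turn a confirmation into a status update their friends can see. A minimal sketch of that flow; Facebook has not published an API for Safety Check, so every name and type here is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    city: str              # last known location (profile city, recent check-in)
    marked_safe: bool = False

def prompt_user(user: User) -> bool:
    """Stand-in for the real push notification with its 'I'm Safe' button."""
    return True  # simulate the user tapping "I'm Safe"

def run_safety_check(users, affected_cities):
    """Notify users located in the affected area; a confirmation becomes a
    status update visible to the user's friends."""
    for user in users:
        if user.city in affected_cities and prompt_user(user):
            user.marked_safe = True
            print(f"{user.name} marked themselves safe.")

run_safety_check([User("Ana", "Manila"), User("Ben", "Oslo")], {"Manila"})
```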

...

http://www.emergencymgmt.com/disaster/Facebook-Safety-Check-Feature-Disasters.html

(MCT) — California is on track to deliver, within two years, an earthquake early warning system that can give 10 seconds to a minute or more of warning that a major earthquake is about to hit, officials said Thursday.

The development of such a system would give gas and electric utilities, railroad operators, crane operators and the public time to take evasive action, said Sen. Alex Padilla, D-Pacoima. His Senate Bill 135 mandated that an early warning system be developed.

The bill, which went into effect in January, required the state Office of Emergency Services to develop a statewide earthquake early warning system to alert Californians in advance of dangerous shaking.

The initial cost to build and operate the system for five years is $80 million.

On Thursday, Padilla said that state Office of Emergency Services officials have told him the system is on track to be operational by January 2016.
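
The physics behind those seconds of warning is straightforward: a quake's fast P-waves reach seismometers well before the destructive S-waves, and an electronic alert outruns both. A back-of-the-envelope sketch, assuming typical crustal wave speeds and an arbitrary five-second detection-and-processing delay (none of these numbers come from the article):

```python
# Typical crustal seismic wave speeds (assumptions, not from the article).
P_WAVE_KM_S = 6.5   # fast primary waves, detected first
S_WAVE_KM_S = 3.5   # slower secondary waves, which cause most damage

def warning_seconds(distance_km, processing_delay_s=5.0):
    """Seconds between issuing an alert (P-wave detection plus processing
    delay) and the arrival of the damaging S-waves at a given distance."""
    p_arrival = distance_km / P_WAVE_KM_S
    s_arrival = distance_km / S_WAVE_KM_S
    return max(0.0, s_arrival - (p_arrival + processing_delay_s))

for d_km in (25, 100, 300):
    print(f"{d_km} km from the epicenter: "
          f"~{warning_seconds(d_km):.0f} s of warning")
```

At 25 km the warning window collapses to zero, while at 300 km it approaches the minute quoted by officials, which is why such systems help most for shaking far from the epicenter.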

...

http://www.emergencymgmt.com/disaster/California-Officials-Target-2016-Earthquake-Early-Warning.html

Winter storms caused $1.9 billion in insured losses in 2013, up sharply from the $38 million in damages seen in 2012, so it's good to read via NOAA's U.S. Winter Outlook that a repeat of last year's winter of record cold and snow is unlikely.

In a release, NOAA’s Climate Prediction Center says:

“Last year’s winter was exceptionally cold and snowy across most of the United States, east of the Rockies. A repeat of this extreme pattern is unlikely this year, although the Outlook does favor below-average temperatures in the south-central and southeastern states.”

While the South may experience a colder winter, the Outlook favors warmer-than-average temperatures in the western U.S., Alaska, Hawaii and New England, according to NOAA.

...

http://www.iii.org/insuranceindustryblog/?p=3820

Over the past 40 years, tidal flooding has quadrupled in many low-lying areas, and that change is accelerating as sea level rises. According to a new study, even moderate sea level rise could as much as triple coastal flooding events in many communities within the next 15 years. Based on moderate projections for sea level rise from the 2014 National Climate Assessment, the Union of Concerned Scientists’ study “Encroaching Tides” calls attention to the threat of routine tidal flooding to much of the East and Gulf Coasts. Unlike storm surge, tidal flooding occurs far more regularly, bringing water above the base sea level during routine tide patterns, for example, twice a month when the moon’s gravitational pull is strongest.

With anticipated sea level rise, even daily tides may flood many areas, according to the report. As the base sea level changes, deviations take on new meaning, which can have drastic implications for property.
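
A toy model makes the point concrete: hold the tide-to-tide variability fixed, raise the baseline, and count how often a fixed flood threshold is exceeded. All numbers below are invented for illustration, not taken from the study.

```python
import random

random.seed(0)
FLOOD_STAGE_M = 0.9   # hypothetical local flood threshold above today's datum

# Two high tides a day for a year, with invented mean height and variability.
tides = [random.gauss(0.5, 0.2) for _ in range(730)]

for rise_m in (0.0, 0.15, 0.30):   # baseline sea level rise scenarios
    floods = sum(1 for t in tides if t + rise_m > FLOOD_STAGE_M)
    print(f"baseline rise {rise_m:.2f} m -> {floods} flood tides per year")
```

Even a modest shift in the baseline moves a large share of ordinary tides past the threshold, which is why routine flooding can triple without any change in weather.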

...

http://www.riskmanagementmonitor.com/east-coast-tidal-flooding-could-triple-by-2030

Enterprise and colocation data centers now able to parallel up to six Liebert eXL UPS units to provide up to 4.15 MW of redundant capacity

COLUMBUS, Ohio – Emerson Network Power, a business of Emerson (NYSE:EMR) and a global leader in maximizing availability, capacity and efficiency of critical infrastructure, has extended the capacity of the Liebert eXL family of large, transformer-free uninterruptible power supply (UPS) systems with today’s introduction of the Liebert® eXL 1200 kVA/kW UPS, as well as upgrades to the previously released 800 kVA/kW model. 

Ideal for colocation facilities and large enterprise data centers, the Liebert eXL UPS 1200 kVA is available in single-module and 1+N (distributed bypass) multi-module systems, providing a choice of configurations to best meet business needs. The Liebert UPS helps data centers maximize operational efficiency and improve PUE by delivering efficiency of up to 97 percent in double conversion mode, with potential savings of up to $12,000 per year over competitive technologies. Flexible configurations and capacity-on-demand with Softscale technology help data center managers conserve capital by limiting their initial investment while allowing them to rapidly expand capacity.
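
The savings figure can be sanity-checked with simple arithmetic: the energy a UPS wastes beyond the IT load is the load times (1/efficiency − 1), so comparing two efficiencies over a year of continuous operation yields a dollar difference. A rough sketch; the load, competitor efficiency and tariff are assumptions rather than Emerson figures, and they land in the same ballpark as the quoted $12,000.

```python
HOURS_PER_YEAR = 8760

def annual_loss_kwh(load_kw: float, efficiency: float) -> float:
    """Energy a UPS dissipates beyond the IT load over a year of 24/7 use."""
    return load_kw * (1.0 / efficiency - 1.0) * HOURS_PER_YEAR

load_kw = 1000           # assumed IT load
price_per_kwh = 0.10     # assumed electricity tariff, USD

savings = (annual_loss_kwh(load_kw, 0.955)    # assumed competitor efficiency
           - annual_loss_kwh(load_kw, 0.97)   # eXL in double conversion mode
          ) * price_per_kwh
print(f"annual savings: ${savings:,.0f}")     # roughly $14,000 here
```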

“Large enterprise data centers and colocation facilities are looking for the most efficient UPS systems that can help them maximize efficiency, reduce operating expenses, improve PUE, achieve service level agreements and cost-effectively prepare for unpredictable future power requirements,” said David Sonner, vice president, marketing for Emerson Network Power in North America. “The Liebert eXL UPS enables today’s dynamic data centers to be rapidly deployed, flexible and easily scalable. It does this by offering high operating efficiency, smaller footprint, higher power density, optimal power delivery, scalable architecture and lower installation cost.”

Softscale technology allows the Liebert eXL UPS to provide capacity-on-demand: the unit is initially sized to current power requirements and easily scaled to a larger capacity without adding footprint. The Liebert UPS can also parallel up to six units to increase capacity and redundancy. The unity power factor rating of the Liebert eXL UPS enables it to provide more usable power for modern IT loads, eliminating the need to oversize the UPS. The larger capacity rating means that systems can be designed with fewer modules, reducing cost, complexity and possible points of failure.
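
The unity power factor point is simple to illustrate: usable real power is the kVA rating multiplied by the power factor, so a unit with a legacy 0.9 rating (a typical figure, not an Emerson spec) must be oversized to serve the same kW load.

```python
def usable_kw(rating_kva: float, power_factor: float) -> float:
    """Real power available to the IT load: kVA rating times power factor."""
    return rating_kva * power_factor

print(usable_kw(1200, 1.0))   # unity PF: the full 1200 kW is usable
print(usable_kw(1200, 0.9))   # legacy 0.9 PF rating: only 1080 kW usable
```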

The Liebert eXL UPS includes an integrated 10.4-inch color human machine interface (HMI) touchscreen, which enables operators to see unit status at a glance and provides detailed information, making it easy to understand, diagnose and control the UPS. The system is enabled with Emerson’s LIFE Services with remote diagnostics and real-time communication, which allows for proactive maintenance and a condensed service cycle. This results in decreased Mean Time to Repair (MTTR) and improved Mean Time Between Failures (MTBF). It also has out-of-the-box compatibility with Emerson Network Power’s data center infrastructure management (DCIM) system, the Trellis platform, and is compatible with the Liebert Nform and Liebert SiteScan monitoring and reporting systems.

Since UPS battery failures continue to be the primary cause of unplanned outages, Liebert eXL UPS battery cabinets are available with factory-installed and tested Alber® BDSUi battery monitoring systems, which continuously monitor battery health to provide advance warning of a pending battery failure and enable maintenance and replacement based on the condition of the batteries rather than on arbitrary schedules.

The Liebert eXL UPS joins the previously introduced 800 kVA model and will be available in North America, Central America and South America for 480V, 60 Hz applications. The Liebert eXL UPS 1200 kVA single-module system, 800 kVA 1+N, and all Softscale models will be available to ship in January 2015; the 1200 kVA 1+N system will be available to ship in March 2015.

For more information on the Liebert eXL UPS, or other technologies and services from Emerson Network Power, visit www.Liebert.com.

Following the release of Insignia Communications’ latest report, ‘The effect of social media on breaking news’, managing director Jonathan Hemus discusses what the findings mean for business continuity managers.

By Jonathan Hemus

With the increased use of social media and ‘citizen journalism’, people are creating and sharing more information than ever before. It is now far easier (and quicker) for disgruntled employees, unhappy customers and campaigners to voice their opinions online, providing a wealth of content for journalists in a crisis.

A perfect example of this affected Apple just last month. Two days after the iPhone 6 went on sale on 19th September, images surfaced on social media showing phones which appeared to have bent in people’s pockets as a result of accidental pressure. Within hours, the pictures had spread like wildfire on Twitter, with thousands of people posting comments using the hashtags #Bentgate and #Bendgate: an unwanted headache for Apple and further proof of the speed at which social media can propel an issue into the spotlight.

...

http://www.continuitycentral.com/feature1238.html

One of the reasons I enjoy writing about technology, particularly data technology, is that I believe it can illuminate real-world problems. So you can imagine my frustration when I tried to fact-check the conflicting data on Ebola’s fatality rates: one article claimed a case fatality rate of 25 percent, while another cited 90 percent.

I checked and, surprisingly, both are right — well, sort of. WHO states:

"The average EVD case fatality rate is around 50%. Case fatality rates have varied from 25% to 90% in past outbreaks.”

This week, WHO bumped that fatality rate to 70 percent.

The reason the numbers range so widely is simple: West African health care systems and reporting structures aren’t advanced enough to properly track cases and deaths, according to the CDC.
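
The arithmetic behind that spread is worth spelling out: a case fatality rate is simply deaths divided by reported cases, so undercounting either term swings the figure dramatically. A small sketch using invented numbers that happen to reproduce the 25, 70 and 90 percent figures above:

```python
def cfr(deaths: int, cases: int) -> float:
    """Case fatality rate: deaths as a share of reported cases."""
    return deaths / cases

# Hypothetical outbreak: 1,000 true cases, 700 eventual deaths (true CFR 70%).
print(f"complete reporting:    {cfr(700, 1000):.0%}")  # 70%

# Counting mid-outbreak: many who will die haven't yet, so the rate looks low.
print(f"deaths still accruing: {cfr(250, 1000):.0%}")  # 25%

# Mild cases go unreported while deaths are well counted: the rate overshoots.
print(f"cases under-reported:  {cfr(700, 780):.1%}")   # ~90%
```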

That’s a rational explanation, but it doesn’t resolve the confusion. Surely, if we’re serious about stopping the spread of Ebola and finding a cure, we’re going to need real data.

...

http://www.itbusinessedge.com/blogs/integration/where-is-the-data-in-the-fight-against-ebola.html

It took home improvement retailing giant Home Depot about a week before it finally confirmed it had suffered a data breach. Home Depot first reported the possibility of a breach on 2 September 2014, but did not actually confirm the hacking until 8 September. During that time, the company made somewhat vague statements that it was still carrying out an investigation to determine whether or not its systems had actually been compromised.

Based on the company’s recent press release confirming the breach (see “The Home Depot Provides Update on Breach Investigation”), it appears that Home Depot’s internal IT security team was unaware that its payment data systems had been compromised. Instead, it looks as if the company only caught on to the breach, and then launched its investigation, once it had received reports from banking partners and law enforcement officials notifying it of suspicious activity on payment cards used at the retailer’s stores. (This is a trend we are seeing more often, and it is disturbing because it signals that the malware used to infect store POS systems is very difficult to detect.) The company believes the breach initially took place sometime in April 2014. No information regarding the size of the breach was included in the press release.

...

http://blog.cutter.com/2014/10/16/developing-an-incident-response-plan-for-data-breaches/