
Industry Hot News


Wednesday, 11 February 2015 00:00

Big Data and the Mirror of Erised

“This mirror will give us neither knowledge nor truth.”

So says Dumbledore in J.K. Rowling’s book, Harry Potter and the Sorcerer’s Stone, commenting on a mirror that shows us what our most desperate desires want us to see.

This is an apt analogy for the analytics available in big data solutions. When you suddenly have all the data you could want and can quickly analyze it any way you like, unencumbered by the extraneous effort we have historically had to endure, what happens? Being human, with a tendency to confirm what we want to happen or to relive what felt good in the past, managers often drift into self-sealing and circular analysis that at first doesn’t seem so wrong. Big data has to poke through these subtle and instinctual responses of data denial.



NEW ORLEANS—On the first day of the International Disaster Conference and Expo (IDCE), one of the primary areas of concern for attendees and speakers alike was the risk of pandemics and infectious diseases. In a plenary session titled “Contagious Epidemic Responses: Lessons Learned,” Dr. Clinton Lacy, director of the Institute for Emergency Preparedness and Homeland Security at Rutgers, focused on the recent and ongoing Ebola outbreak.

While only four people in the United States were diagnosed with Ebola, three of whom survived what was previously considered a death sentence, government and health officials cannot afford to ignore the crisis, Lacy warned.

“This outbreak is not just a cautionary tale, it is a warning,” Lacy said. “Ebola is our public health wakeup call.”

A slow start by the Centers for Disease Control, inadequate protective gear in healthcare facilities, and inadequate planning for screening, quarantine and waste management were some of the key failings in national preparedness for Ebola. And all were clearly preventable. A significant amount has been done to improve preparedness, Lacy said, but there is still a significant amount yet to do as well.



(TNS) — Commissioners and emergency officials in Pennsylvania are calling for reform for what they say is an outdated emergency telephone services law.

The law, enacted in 1990, doesn’t sufficiently address cellphones and other wireless devices and is adversely affecting funding for 911 systems, they say.

“This is the top priority for the (County Commissioners Association of Pennsylvania) this year,” Somerset County Commissioner Pam Tokar-Ickes said.

Tokar-Ickes also serves on the statewide organization’s board of directors.

“Since 1990, there have been significant changes because of technology — a lot more people using wireless devices — and the legislation is a piecemeal collection.”



(TNS) — When Paul Allen picks a cause, he usually takes his time.

The Microsoft co-founder likes to convene brainstorming sessions, consult experts and recruit advisers before making major philanthropic gifts.

But when Ebola flared in West Africa last summer, Allen was among the first private donors to step up. As the toll from the disease soared, he quickly raised his commitment to $100 million — the largest from any individual and double the amount contributed by the Bill & Melinda Gates Foundation.

Now that the epidemic seems to be slowing, Allen is still moving fast.



(TNS) — What would you do with a few seconds or minutes of warning before an earthquake strikes?

When late-night comedian Conan O’Brien considered the question recently, the result was a laugh-out-loud segment with people stampeding into walls, snapping risqué selfies or cranking up the boom box for one last dance.

A more sober — and useful — range of options will be on the table next week, when a small group of businesses and agencies embark on the Northwest’s first public test of a prototype earthquake early warning system.

“Up until now, we’ve been running it and watching the results in-house only,” said John Vidale, director of the Pacific Northwest Seismic Network at the University of Washington.



Enterprise apps are a hot item. I wrote a recent feature that cited research from appFigures, Kinvey and Frost & Sullivan that, in a variety of ways, pointed to the growth in interest on the part of both developers and their clients.

QuinStreet Enterprise, which publishes IT Business Edge, has released survey research that reveals an important finding: The user interface (UI) and related ease-of-use features rank very high on (if not at the top of) the list of important elements in the success of an enterprise app. The survey, “2015 Enterprise Applications Outlook: To SaaS or Not to SaaS” (free download with registration), said that the key features for enterprise users are easy implementation, smooth integration with existing technology and good security.



No matter what your stance on the cloud and its role in supporting critical vs. non-critical workloads, it should be clear by now that any data infrastructure that remains in the enterprise will be dramatically different from the sprawling, silo-based facilities of today.

Retaining key workloads in-house will likely be a priority for some, but that does not mean the data center isn’t ripe for an upgrade that improves data-handling while lowering capital and operational costs. And the strategy of choice at the moment is convergence.



(TNS) — As earthquakes continue to rattle Oklahomans after a record-setting year, state officials are trying to coordinate their responses and soothe fears.

Secretary of Energy and Environment Michael Teague said Friday his office will develop a website to help keep the public informed of various agency actions on earthquakes. He said it will be modeled after the Oklahoma Water Resources Board’s drought page, drought.ok.gov.

The state had 585 earthquakes greater than a 3.0 magnitude in 2014, up from 109 in 2013. Some studies have linked wastewater injection wells from oil and gas development to increased seismic activity.

“We recognize we have a problem,” said Teague, who heads the Governor’s Coordinating Council on Seismic Activity. “There’s something going on. But the science is not completely settled.”



(TNS) — New Mexico hasn’t had its first zombie infection yet, but if that happens, Nick Generous and others on a Los Alamos National Laboratory team will probably map it on their new Biosurveillance Gateway website.

All epidemics — whether Ebola, measles or zombie apocalypses — begin with patient zero.

“In the earliest stages of outbreak, there’s this critical period of time that officials can enact certain interventions to minimize and prevent the spread,” said Generous, a molecular biologist who helped develop the Biosurveillance Gateway. “So, how do you decide what to do?”

Quarantine, vaccinate or, in the case of that nasty zombie, just shoot its head off?



Telecommunications networks are huge users of energy. The cable industry, for instance, relies upon millions of servers, amplifiers and other network devices throughout vast networks. These all need to be powered. In homes, set-top boxes, gateways and other gadgets need juice, as well.

Cable and telcos, and the companies that support them, are taking steps to control this usage, at least in the home. In 2012, companies connected with the pay television industry entered a voluntary agreement to cut energy use in set-top boxes (STBs). Late last summer, D&R International, Ltd. on behalf of the group, published a report on the impact of the initiative on usage during 2013.

The report, according to Switchboard, the Natural Resources Defense Council staff blog, suggests very strongly that the agreement is having the desired effect. Energy use decreased 5 percent during the year, saving about $168 million. Energy use by new STBs was 14 percent less than that of devices installed in 2012. The story points out that the next wave of voluntary requirements will increase savings to $1 billion annually when they are implemented in 2017.



Tuesday, 10 February 2015 00:00

Expect Shadow IT to Be a Long-Term Problem

Last week, CipherCloud revealed the results of a survey regarding the use of shadow IT. The study found that of the 1,100 cloud applications used in an enterprise setting, 86 percent are being used without the authorization of the IT department.

Fellow IT Business Edge blogger Arthur Cole believes that, despite the high use of shadow IT within the workspace, the practice’s decline is inevitable. He wrote:

Now that the cloud has taken a firm hold in the enterprise, shadow IT will diminish naturally as internal resources gain the flexibility and availability that knowledge workers require. In fact, you could argue that shadow IT is a net positive for the enterprise because it creates the impetus to shed aging, silo-based infrastructure in favor of a more flexible, dynamic environment. And ultimately, this will allow many organizations to abolish their IT cost centers entirely in order to focus resources on more profitable endeavors.



Here’s the quick version. Hackers operating on the same cloud server hardware as you can steal your encryption keys and run off with your data/bank codes/customers/company (strike out items that do not apply – if any). Yes, behind that mouthful of a title is a scary prospect indeed. Until recently, this kind of cloud-side hacking had been discussed but not observed. Now a team of computer scientists has managed to recover a private key used by one virtual machine by spying on it from another virtual machine. A hacker could therefore conceivably do the same to your VM from another VM running on the same server. How worried should you be?



Friday, 06 February 2015 00:00

Federal Flood Risk Management Standard

WASHINGTON – On January 30, the President issued Executive Order 13690, “Establishing a Federal Flood Risk Management Standard and a Process for Further Soliciting and Considering Stakeholder Input.” Prior to implementation of the Federal Flood Risk Management Standard, additional input from stakeholders is being solicited and considered on how federal agencies will implement the new Standard. To carry out this process, a draft version of the Implementing Guidelines is open for comment until April 6, 2015.

Floods, the most common natural disaster, damage public health and safety, as well as economic prosperity. They can also threaten national security. Between 1980 and 2013, the United States suffered more than $260 billion in flood-related damages. With climate change and other threats, flooding risks are expected to increase over time. Sea level rise, storm surge, and heavy downpours, along with extensive development in coastal areas, increase the risk of damage due to flooding. That damage can be particularly severe for infrastructure, including buildings, roads, ports, industrial facilities and even coastal military installations.

The new Executive Order amends the existing Executive Order 11988 on Floodplain Management and adopts a higher flood standard for future federal investments in and affecting floodplains, which will be required to meet the level of resilience established in the Federal Flood Risk Management Standard. This includes projects where federal funds are used to build new structures and facilities or to rebuild those that have been damaged. It ensures that buildings are constructed to withstand the impacts of flooding, improves the resilience of communities, and protects federal investments.

This Standard requires agencies to consider the best available, actionable science on both current and future risk when taxpayer dollars are used to build or rebuild in floodplains. On average, more people die annually from flooding than from any other natural hazard. Further, the costs borne by the federal government are greater than for any other hazard. Water-related disasters account for approximately 85% of all disaster declarations.

The Standard establishes the flood level to which new and rebuilt federally funded structures or facilities must be resilient. In implementing the Standard, agencies will be given the flexibility to select one of three approaches for establishing the flood elevation and hazard area they use in siting, design, and construction:

  • Utilizing best available, actionable data and methods that integrate current and future changes in flooding based on climate science;
  • Two or three feet of elevation, depending on the criticality of the building, above the 100-year, or 1%-annual-chance, flood elevation; or
  • 500-year, or 0.2%-annual-chance, flood elevation.
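The two map-based approaches above reduce to simple arithmetic; the following Python sketch illustrates them with made-up elevations (real determinations come from flood maps and engineering studies, not a formula):

```python
def design_flood_elevation(approach, base_100yr, base_500yr=None, critical=False):
    """Return the required elevation (ft) under two of the three FFRMS approaches.

    approach: 'freeboard' (100-year elevation plus 2-3 ft of freeboard) or
    '500yr' (the 0.2%-annual-chance elevation). The third, climate-science
    approach requires site-specific modeling and is not expressible as a formula.
    """
    if approach == "freeboard":
        # +2 ft above the 100-year (1%-annual-chance) elevation,
        # or +3 ft for critical buildings such as hospitals
        return base_100yr + (3 if critical else 2)
    if approach == "500yr":
        return base_500yr
    raise ValueError("unsupported approach: " + approach)

# Hypothetical site: 100-year elevation 12 ft, 500-year elevation 15 ft
print(design_flood_elevation("freeboard", 12))                 # 14
print(design_flood_elevation("freeboard", 12, critical=True))  # 15
print(design_flood_elevation("500yr", 12, base_500yr=15))      # 15
```

Note that for this hypothetical site the critical-facility freeboard and the 500-year elevation happen to coincide; in general an agency would pick whichever approach fits its program.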

Prior to implementation of the Federal Flood Risk Management Standard, additional input from stakeholders is being solicited and considered. To carry out this process, FEMA, on behalf of the Mitigation Framework Leadership Group (MitFLG), published a draft version of Implementing Guidelines that is open for comment. A Federal Register Notice has been published to seek written comments, which should be submitted at www.regulations.gov under docket ID FEMA-2015-0006 for 60 days.  Questions may be submitted to FEMA-FFRMS@fema.dhs.gov.

FEMA will also be holding public meetings to further solicit stakeholder input and will also host a virtual listening session in the coming months. Notice of these meetings will be published in the Federal Register.  At the conclusion of the public comment period, the MitFLG will revise the draft Implementing Guidelines, based on input received, and provide recommendations to the Water Resources Council.

The Water Resources Council will, after considering the recommendations of the MitFLG, issue amended guidelines to provide guidance to federal agencies on the implementation of the Standard. Agencies will not issue or amend existing regulations or program procedures until the Water Resources Council issues amended guidelines that are informed by stakeholder input.

FEMA looks forward to participation and input in the process as part of the work towards reducing flood risk, increasing resilience, cutting future economic losses, and potentially saving lives.


FEMA's mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain and improve our capability to prepare for, protect against, respond to, recover from and mitigate all hazards.

Follow FEMA online at www.fema.gov/blog, www.twitter.com/fema, www.facebook.com/fema and www.youtube.com/fema.  Also, follow Administrator Craig Fugate's activities at www.twitter.com/craigatfema.

The social media links provided are for reference only. FEMA does not endorse any non-government websites, companies or applications.

I was watching one of my favorite news shows late last night when the host came back from commercials with a breaking news story: Health-insurance company Anthem had been breached. The show’s host provided a couple of details of what the breach entailed; he said it was the personal information of customers and employees, including their addresses, birthdates and Social Security numbers (the emphasis was the host’s).

After that, I knew exactly what I was going to be waking up to this morning: an inbox filled with commentary on this latest high-profile breach and a topic right at hand for today’s blog post.

Much of that commentary applauded Anthem for its quick response to the breach, like this comment from Lee Weiner, SVP of products and engineering with Rapid7:



The Internet of Things is among the trends driving companies to invest in data virtualization, according to Suresh Chandrasekaran, senior VP for data virtualization vendor Denodo.

Data virtualization isn’t normally something you hear about in Big Data discussions. I asked Chandrasekaran what problem data virtualization solved for IoT and other Big Data projects. Sensor data is generally pooled in a data repository or data lake, he explained, but it’s of little use without context.

Data virtualization allows you to leverage sensor and other Big Data and add context using other data sources. For instance, if you’re using sensors to monitor vehicles, you might want to combine that with maintenance records to predict when parts need to be changed.
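The pattern Chandrasekaran describes, combining raw sensor readings with contextual records at query time, can be sketched in a few lines of Python (the field names and thresholds are illustrative, not any vendor’s API):

```python
# Raw sensor readings, as they might sit in a data lake
sensor_readings = [
    {"vehicle_id": "V1", "vibration": 0.92},
    {"vehicle_id": "V2", "vibration": 0.31},
]

# Contextual data held in a separate maintenance system
maintenance = {
    "V1": {"miles_since_service": 41000},
    "V2": {"miles_since_service": 3000},
}

def enriched_view():
    """A 'virtual' view: the join happens at read time, nothing is copied."""
    for reading in sensor_readings:
        context = maintenance.get(reading["vehicle_id"], {})
        yield {**reading, **context}

# Flag vehicles whose readings plus service history suggest a part change
flags = [v["vehicle_id"] for v in enriched_view()
         if v["vibration"] > 0.8 and v.get("miles_since_service", 0) > 30000]
print(flags)  # ['V1']
```

The point of the virtual view is that neither source is duplicated; the sensor lake and the maintenance system stay where they are, and the combined picture exists only when queried.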



Friday, 06 February 2015 00:00

Is Your CEO Next? The Data Ticking Time Bomb

This morning, I read the news that Anthem Insurance had a massive data breach and that Amy Pascal, who led Sony pictures as co-chairman, was stepping down as a result of Sony’s breach.

I’d just been sent a Varonis study, written by the Ponemon Institute. “Corporate Data: A Protected Asset or a Ticking Time Bomb?” couldn’t be more timely. The danger in not taking data security seriously is growing.

Let’s talk about this report against those events this week.



Thursday, 05 February 2015 00:00

Building the Agile Database

Is fast development the enemy of good development? Not necessarily. Agile development requires that databases be designed and built quickly enough to meet fast-paced delivery schedules - but in a way that also delivers maximum business value and reuse. How can both requirements be satisfied? This book, suitable for practitioners at all levels, explains how to design and build enterprise-quality, high-value databases within the constraints of an agile project.

Starting with an overview of the business case for good data management practices, the book defines the various stakeholder groups involved in the software development process, explains the economics of software development (including "time to market" vs. "time to money"), and describes an approach to agile database development based on the five PRISM principles.



Thursday, 05 February 2015 00:00

12-Step Program for Emergency Managers

There are 12-step programs for many personal issues, so I figured there should be a 12-Step Program for Emergency Managers. I’ve written about our addiction to Department of Homeland Security grants that are administered by FEMA. Therefore it is only natural that we look for ways to escape our addiction and gain control over our individual programs. Getting out of addictive behavior can be difficult. 

Generally the concept of 12-step programs is to acknowledge a higher power and give everything over to its control. The only “higher power” that emergency managers have is FEMA, so we are in a bit of a Catch-22 in that we are trying to escape its grant clutches while at the same time giving our lives over to its control. We should at least try this 12-step program that I’ve adapted from Alcoholics Anonymous.



Thursday, 05 February 2015 00:00

The Inevitable Decline of Shadow IT

Sometimes it seems as if the enterprise is so caught up in preparing for the future that it fails to notice what is happening in the present.

The cloud is a prime example, with most top data executives enamored by visions of limitless, federated infrastructure able to do anyone’s bidding at the touch of a few mouse clicks. In the meantime, however, few are overly concerned by the unorganized spread of data across external cloud platforms, the so-called shadow IT, despite the significant loss of control it represents.

According to CipherCloud, about 86 percent of enterprise applications are now tied to shadow IT, especially those involved in publishing, social networking and career-based functions. This should be of particular concern to the enterprise considering the increasing sophistication of mobile malware and the ongoing spate of massive data breaches. However, many organizations are not even aware of the scope of the problem: One major enterprise in the survey claimed to have only 15 file-sharing apps in use when in reality the number was nearly 70.



Thursday, 05 February 2015 00:00

What’s ‘Good Enough’ Data Quality?

When you dig into data quality—and more of you are—you’ll hear a lot about “good enough” data quality. But what the heck does that mean? And how do you know if you’ve achieved it?

Data folks have long understood that data quality is a continuum. Data quality comes with an associated cost and, at some point, that cost is not worth paying to further “perfect” the data; hence, the concept of “good enough” data quality.

That may have made sense in a relational database world, but now … it’s complicated. The data isn’t just being used for reporting, but is also being leveraged in BI and analytics systems. Data has left IT and is being used to drive decisions across the organization. What’s more, data looks different—it’s now social data, sensor data, external data, Big Data.



Thursday, 05 February 2015 00:00

Selfie-Sticks and Risk Assessments

Greetings from Venice and a big thanks to Joe Oringel at Visual Risk IQ for allowing me to post his five tips on working with data analytics while I was on holiday in this most beautiful, haunting and romantic of cities. While my wife and I have come here several times, we somehow managed to arrive on the first weekend of Carnivale without knowing when it began. On this first weekend, the crowds were not too bad and it was more of a locals’ scene than the full, all-out tourist scene.

As usual, Venice provides several insights for the anti-corruption compliance practitioner, whether you labor under the Foreign Corrupt Practices Act (FCPA), the UK Bribery Act, both, or some other such law. One of the first things I noticed in Venice was the large number of selfie-sticks in use by (obviously) tourists. But what struck me was that the street vendors who previously sold all manner of knock-off and counterfeit purses, wallets and otherwise fake leather goods had now moved exclusively to marketing these selfie-sticks. Clearly these street vendors were responding to a market need and had moved quickly to fill this niche.



Thursday, 05 February 2015 00:00

Cloud Adoption and Risks

With faster time to market, massive economies of scale, and unparalleled agility, the cloud is entering enterprises at an unprecedented rate. As a result, hundreds of high-risk cloud applications are commonly used across North American and European organizations, says a CipherCloud report. The report details the results of a comprehensive study of cloud usage and risks, compiled from enterprise users in North America and Europe.

‘Cloud Adoption & Risk Report in North America & Europe – 2014 Trends’ includes anonymised data of cloud user activity collected for the full 2014 calendar year, spanning thousands of cloud applications.



Thursday, 05 February 2015 00:00

Another Mega Data Breach

In what is being described as potentially the largest breach of a health care company to date, health insurer Anthem has confirmed that it has been targeted in a very sophisticated external cyber attack.

The New York Times reports that hackers were able to breach a company database that contained as many as 80 million records of current and former Anthem customers, as well as employees, including its chief executive officer.

Early reports here and here suggest the attack compromised personal information such as names, birthdays, medical IDs/social security numbers, street addresses, email addresses and employment information, including income data.



(TNS) — While many coastal communities in the Tampa Bay area have been spared a catastrophic spike in flood insurance rates for now, local city leaders say they’re preparing for the worst over the long haul.

In Belleair Bluffs on Tuesday, the Florida League of Cities hosted the first in a series of meetings throughout the state to encourage city governments to invest more in flood mitigation programs that can reduce the risk of storm damage and lower federal flood premiums for local residents by an average of 20 percent.

Cities can increase those savings for nearly all residents who carry flood coverage by improving storm-water drainage, enhancing building codes, moving homes out of potentially hazardous areas and effectively communicating about storm danger and evacuation routes.



(TNS) — Colorado Springs is making a pitch to host a new state-funded center for fire research, a technology hub that could help propel Colorado to the forefront of revolutionizing how wildfires are fought.

The Colorado Springs Regional Business Alliance plans to submit a report this week detailing why El Paso County, twice victim of catastrophic wildfire, should be the new home for the fire research center.

While the public eye may have been trained on the Colorado Firefighting Air Corps created last year, a lesser-known aspect of the Centennial-based fleet — the Center for Excellence for Advanced Technology Aerial Firefighting — has been on the wish list for some Colorado Springs leaders for months.



Despite increasing attention to cybersecurity and a seemingly constant stream of high-profile data breaches, the primary security method used in businesses worldwide remains the simple password. According to a recent study, the average person now has 19 passwords to remember, so it is not surprising that the vast majority of passwords are, from a security perspective, irrefutably bad, including sequential numbers, dictionary words or a pet’s name.

A new report by software firm Software Advice found that 44% of employees are not confident about the strength of their passwords. While many felt their passwords were either extremely or very secure, the group reported, “our findings suggest that users either remain unaware of the rules despite the hype, do not believe them to be good advice or simply find them too burdensome, and thus opt for less secure passwords.”
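The weak patterns the study calls out, such as sequential numbers, dictionary words and pet names, are easy to detect programmatically. A minimal sketch in Python (the word list is a stand-in for a real dictionary file):

```python
def is_weak(password, dictionary={"password", "letmein", "fluffy"}):
    """Flag passwords matching the common weak patterns named in the study."""
    p = password.lower()
    if len(p) < 8:
        return True                          # too short
    if p in dictionary:
        return True                          # dictionary word or pet name
    if p.isdigit() and p in "0123456789":
        return True                          # sequential digits like "12345678"
    return False

print(is_weak("12345"))      # True (short)
print(is_weak("fluffy"))     # True (short, and a pet name)
print(is_weak("T7#kq9!zW"))  # False
```

Real password policies go further (entropy estimates, breach-corpus lookups), but even checks this crude would catch most of the passwords the report describes.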

Among the biggest password sins employees commit:



Data security has become an even bigger topic in the last year following several high-profile data breaches at consumer companies, and much of the focus has been on protecting against the breaches themselves. But are there other ways to protect data? MSPmentor recently took a deeper look at a technology called data masking. Here's what we found.

Many banks, government agencies, hospitals, insurance companies and other organizations that manage highly sensitive information are using a technique to hide their data from cybercriminals: data masking. The technique camouflages the real data you want to protect by replacing or interspersing it with other characters and data. So the data hides in plain sight, where the sensitive values cannot be seen or discovered.
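The idea can be illustrated with a short Python sketch. The field names and helper functions here are hypothetical; commercial masking products apply format-preserving rules at the database or driver layer:

```python
import random

def mask_ssn(ssn):
    """Mask all but the last four digits of a Social Security number."""
    digits = [c for c in ssn if c.isdigit()]
    return "XXX-XX-" + "".join(digits[-4:])

def pseudonymize_name(name, rng=random.Random(0)):
    """Swap a real name for a consistent fictitious one (illustrative only)."""
    fake_names = ["Alex Doe", "Sam Roe", "Pat Poe"]
    return rng.choice(fake_names)

record = {"name": "Jane Smith", "ssn": "123-45-6789"}
masked = {"name": pseudonymize_name(record["name"]),
          "ssn": mask_ssn(record["ssn"])}
print(masked["ssn"])  # XXX-XX-6789
```

The masked copy keeps the shape of the original (useful for testing and analytics) while the values an attacker actually wants are gone.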



Enterprises are scrambling to come up with ways to scale their infrastructure to meet the demands of Big Data and other high-volume initiatives. Many are turning to the cloud for support, which ultimately puts cloud providers under the gun to enable the hyperscale infrastructure that will be needed by multiple Big Data clients.

Increasingly, organizations are turning to in-memory solutions as a means to provide both the scale and flexibility of emerging database platforms like Hadoop. Heavy data loads have already seen a significant performance boost with the introduction of Flash in the storage farm and in the server itself, and the ability to harness non-volatile RAM and other forms of memory into scalable fabrics is quickly moving off the drawing board, according to Evaluator Group’s John Webster. In essence, the same cost/benefit ratio that solid state is bringing to the storage farm is working its way into the broader data infrastructure. And with platforms like SAP HANA hitting the channel, it is becoming quite a simple matter to host entire databases within memory in order to gain real-time performance and other benefits while still maintaining persistent states within traditional storage.



By Leon Adato

In the corporate environment, end users and, more worryingly, the occasional IT pro, are the first to point the finger of blame at the network when an application is sluggish, data transfer is too slow or a crucial Voice over IP (VoIP) call drops, all of which can have a wider impact on the bottom line.

Issues arise when the IT department looks to blame the network as a whole, rather than work to identify problems that are caused by an individual application running on the network. Poor design, large content and memory leaks can all cause an application to fail, yet IT departments can be slow to realise this.

Many companies are reliant on applications to drive business-critical processes. At the same time, applications are becoming increasingly complex and difficult to support, which puts additional pressure on the network. So, the question remains, when there’s an issue with application performance, is it the network or is it the application? How do you short-circuit the ‘blame game’ and determine the root-cause of an issue so it can be solved quickly and efficiently?




In the past we have often heard that people got involved with business continuity through another career, perhaps drifting into it from facilities management or IT security. Now we are finding that more and more people are starting off in a business continuity role; the industry has developed into a career opportunity in its own right and people are joining it straight from school, college or university. In order to develop the industry further and take it forward, we need to inspire and encourage the right people to become business continuity professionals, and where better to do this than in schools?

To meet this aim, the Business Continuity Institute has formed a new partnership with Inspiring the Future, a free service where volunteers pledge one hour a year to go into state schools and colleges and talk about their job, career, and the education route they took. Already to date, over 7,500 teachers from 4,400 schools and colleges and over 18,500 volunteers have signed up.

Everyone from Apprentices to CEOs can volunteer for Inspiring the Future. Recent graduates, school leavers, apprentices, and people in the early stages of their career can be inspirational to teenagers - being close in age they are easy to relate to; while senior staff have a wealth of knowledge and experience to share. Your insights will help to inspire and equip students for the next steps they need to take.

Inspiring the Future is currently running a campaign called Inspiring Women, with the aim of getting 15,000 inspirational women, from Apprentices to CEOs, signed up to Inspiring the Future to go into state schools and colleges to talk to girls about the range of jobs available and break down any barriers or stereotypes. For further information, click here.

Why volunteer in a local school or college?

  • Going into state schools and colleges can help dispel myths about jobs and professions, and importantly, ensure that young people have a realistic view of the world of work and the routes into it.
  • Getting young people interested in your job, profession or sector can help develop the talent pool and ensure a skilled workforce in the future.

To sign up to Inspiring the Future as a BCI member, simply click here and follow the steps. In the ‘My Personal Details’ section, under the heading ‘My memberships of Professional Association …’ please write Business Continuity Institute and it will appear for you to select.

By signing up, you make it easy for local schools and colleges to get in touch to see if you can help them help their pupils make better decisions about the future.  You might be asked if you could take part in a careers’ fair, in career networking (speed dating about jobs) or do a lunchtime talk to sixth formers about your job and how you got it. 

Volunteering for Inspiring the Future is free, easy, effective and fun. Volunteers and education providers are connected securely online, and volunteering can take place near home or work as employees specify the geographic locations that suit them. Criminal Records Bureau checks are not needed for career insights talks, as a teacher is always present.

Inspiring the Future is a UK initiative but if you know of a similar scheme in another country then get in touch and let us know. Our aim is to inspire people to become business continuity professionals all across the world.




When he speaks of that Thursday, Nov. 6, 2014, Dan Hoffman’s memory is a blur. Details come back in hazy pieces. His first recollections flash back to a headache, a throbbing pain that drove him into an afternoon nap. Next he recalls the sensations of heat, waking to a baking swelter. Next the glow of flames, a black canopy of smoke above, coughs shaking his lungs, the fire alarm shrieking, attempting to stand, to breathe, to reach for his cellphone and dial 911.

“My instinct was to get out,” Hoffman said.

He stumbled from the bedroom, to the bathroom, to the living room of his family’s home in Traverse City, Mich. The voice of a dispatcher must have spoken to him through his cellphone. He doesn’t recall it though. He only remembers listening to his own voice. He said the word “help” twice. It was the last thing he heard before collapsing, falling unconscious as his house continued to burn.



Tuesday, 03 February 2015 00:00

Measles and the Risk of Infectious Diseases

If you’re reading about the rising number of measles cases in California, you may also be thinking about pandemic risk.

First, let’s look at the status of measles cases and outbreaks in the United States.

The CDC notes that from January 1 to January 28, 2015, 84 people from 14 states were reported to have measles. Most of these cases are part of a large, ongoing outbreak linked to Disneyland in California.

On Friday (January 30, 2015), the California Department of Public Health released figures showing there are now 91 confirmed cases in the state. Of those, 58 infections have been linked to visits to Disneyland or contact with a sick person who went there.

At least six other U.S. states – Utah, Washington, Colorado, Oregon, Nebraska and Arizona – as well as Mexico have also recorded measles cases connected to Disneyland, according to this AP report.

What about last year?



Don’t think you are vulnerable to an insider threat? You might want to have a conversation with your IT department, then. According to Vormetric's 2015 Insider Threat Report, 93 percent of IT personnel think their company is at risk from an insider threat, and 59 percent of respondents consider privileged users, the employees with high-level access to very sensitive data, to be their company’s greatest threat.

Thanks in part to the recent Sony hack, insider threats and the dangers they pose are getting a lot more attention than they have in the past. But as Eric Guerrino, executive vice president of the Financial Service Information Sharing and Analysis Center, was quoted in eSecurity Planet, insider threats have been a problem for a long time and a top focus area for security concerns. It’s just that now those beyond IT and security staff are beginning to grasp the severity of the issue.



A survey of New South Wales Shires and Councils has looked at risk management, business continuity, and internal audit practices and identified a number of gaps in some critical areas. Over 50 percent of NSW councils participated in the survey, which was conducted by InConsult.

“The high number of responses has provided data that we believe to be valid and paints a good picture of the current state of risk management in NSW councils,” says InConsult Director Tony Harb.

“Overall, we have seen improvements across the board in risk management practices, such as developing formal risk management policies and strategies, formal risk appetite statements and maintaining comprehensive risk registers. More Councils now class their risk management in the ‘proficient’ category of risk management maturity.



Many CEOs tend to see business continuity management purely within the context of complying with governance codes. But, says Leigh-Anne van As, business development manager at ContinuitySA, CEOs also need to see how business continuity management can help them answer three key strategic questions.

Van As argues that CEOs need to be able to answer ‘yes' to three key questions:

  • Do you know which products and services offered by your company are vital to ensuring its strategic objectives can be met?
  • Is your organizational structure aligned to the company's strategic objectives?
  • Do you know exactly which resources (including human resources) are required for the company to achieve its strategic objectives?

"Companies typically offer a multiplicity of products and services, but CEOs and their immediate teams need to understand which ones are absolutely vital to the company's ability to meet its strategic targets. They also need to understand exactly which resources are essential to delivering those products and services," she explains. "Once they have the answers, CEOs and their teams can allocate investment and attention appropriately, and optimise the company's operations."



Tuesday, 03 February 2015 00:00

DDoS attacks proving costly for businesses


According to a study conducted by Kaspersky Lab and B2B International, a Distributed Denial of Service (DDoS) attack on a company’s online resources might cause considerable losses – with average figures ranging from $52,000 to $444,000 depending on the size of the company. For many organizations these expenses have a serious impact on the balance sheet as well as harming the company’s reputation due to loss of access to online resources for partners and customers.

According to the study, 61% of DDoS victims temporarily lost access to critical business information; 38% of companies were unable to carry out their core business; 33% of respondents reported the loss of business opportunities and contracts. In addition, in 29% of DDoS incidents a successful attack had a negative impact on the company’s credit rating while in 26% of cases it prompted an increase in insurance premiums.

DDoS attacks are not just costly, they are also becoming more frequent and more complex. In a different study, one carried out by Arbor Networks, it was revealed that 38% of respondents to a survey experienced more than 21 attacks per month compared to just over 25% in 2013. It was also noted that we are now experiencing much larger attacks, sometimes over 100Gbps and even up to 400Gbps. Ten years ago the largest attack was 8Gbps.

With this as a backdrop, it is perhaps no surprise that cyber attacks have consistently been one of the top three threats for business continuity professionals according to the Business Continuity Institute’s annual Horizon Scan report.

“A successful DDoS attack can damage business-critical services, leading to serious consequences for the company. For example, the recent attacks on Scandinavian banks (in particular, on the Finnish OP Pohjola Group) caused a few days of disruption to online services and also interrupted the processing of bank card transactions, a frequent problem in cases like this. That’s why companies today must consider DDoS protection as an integral part of their overall IT security policy. It’s just as important as protecting against malware, targeted attacks, data leak and the like,” said Eugene Vigovsky, Head of Kaspersky DDoS Protection, Kaspersky Lab.



Most actuaries know about projections that go awry, so we have quite a bit of sympathy for the weather forecasters who missed the mark early this week, says I.I.I.’s Jim Lynch:

Weather forecasts have improved dramatically in the past generation, but this storm was odd. Usually a blizzard is huge. On a weather map, it looks like a big bear lurching toward a city.

This storm was relatively small but intense where it struck. On a map, it looked like a balloon, and the forecasters’ job was to figure out where the balloon would pop. They were 75 miles off. It turned out they over-relied on a model – the European model, which had served them well forecasting superstorm Sandy, according to this NorthJersey.com post mortem.



If you’ve ever wondered whether your data governance committee is covering the right issues, then you’ll want to read Joey Jablonski’s recent column, “12 Step Guide for Data Governance in a Cloud-First World.”

Despite the title, five of the steps are actually a great strategic discussion list for any data governance group. Jablonski says organizations should cover each of the following:



Monday, 02 February 2015 00:00

A Strange Diagram

I found this – and have never seen it before:


It’s a strange thing, as it appears to begin at the top of the cycle with ‘Corporate responsibility’. I understand the definition (the Financial Times Lexicon offers one: corporations have a responsibility to those groups and individuals that they can affect, i.e. their stakeholders, and to society at large; stakeholders are usually defined as customers, suppliers, employees, communities and shareholders or other financiers). But is that something that should be at the core of the diagram rather than part of a security management cycle? I’m not splitting hairs here; this is, I think, about the separation of process from strategy. Further, shouldn’t ‘Understand the Organization’ come first? It does for me: unless we understand the organization, how can we meet our responsibilities - corporate, security or otherwise?



A growing hazard has emerged in the cloud security space that is threatening organizations from inside of their own physical and virtual walls. As employees across multiple industries continue to adopt ‘shadow cloud’ services in the workplace, organizations and managed service providers (MSPs) need to carefully monitor its effects on security and cloud-based file sharing.

The Cloud Security Alliance (CSA) officially defines “shadow cloud” services as “cloud applications and services adopted by individual employees, teams, and business units with no formal involvement from the organization’s IT department.” This unsanctioned cloud usage poses a potential security risk to individuals and enterprises alike, as the services are less protected and secured.



Robin Murphy is a leader in the field of disaster robotics, having started working on the topic in 1995 and researching how the mobile technologies have been used in 46 emergency responses worldwide. She has developed robots that have helped during responses to numerous emergencies, including 9/11 and Hurricane Katrina. As director of the Center for Robot-Assisted Search and Rescue at Texas A&M University, Murphy works to advance the technology while also traveling to disasters when called upon to help agencies determine how robots can aid the response. The center’s first deployment was in response to 9/11, which also was the first reported use of a robot during emergency response.

Emergency Management: Since 9/11, how have you seen the use of robots in disasters change?

Robin Murphy: We started out in 2001 and up until 2005 you didn’t see the use of anything but ground robots. Everything was very ground-centric, and I think that reflected the state of the technology. For years we had bomb squad robots, which were being made smaller and smaller for military tactical operations so that gave them a tool that was pretty easy to use. Starting in 2005, we saw the first use of small unmanned aerial vehicles that were being developed primarily for the military market and those were very useful. Those have really come up and, in fact, since 2011, I’ve only found one disaster that didn’t use an unmanned aerial vehicle and that was the South Korea ferry where they used an underwater vehicle. So we went from ground robots dominating to about 2005 and then we started shifting toward unmanned aerial vehicles. In about 2007, it became much more commonplace to see underwater vehicles being used. Then starting in about 2011, I think if you have a disaster and you’re an agency and you haven’t figured out a way to use a small unmanned aerial system, it’s kind of surprising.



Friday, 30 January 2015 00:00

10 Expert Tips for Better Data Storage

Better data storage means different things to different people. For some it is all about speed; for others, cost is the primary factor. For many it is about coping with soaring data volumes, while for some, simplicity and ease of installation and use are the top-of-mind elements.

Whatever your opinion of what better data storage is, here are a few tips on how to improve storage in the coming year.



What worries chief information officers (CIOs) and IT professionals the most? According to a recent survey commissioned by Sungard Availability Services, information security, downtime and talent acquisition weigh heaviest on their minds.

Information security
Due to the increasing frequency and complexity of cyber-attacks, security ranks highest among workplace IT concerns for CIOs; as a result, more than half of survey respondents (51 percent) believe security planning should be the last item to receive budget cuts in 2015.

While external security threats are top of mind for IT professionals, internal threats are often the root cause of security disasters. Nearly two-thirds of survey respondents cited leaving mobile phones or laptops in vulnerable places as their chief security concern (62 percent), followed by password sharing (59 percent). These internal security challenges created by employees lead 60 percent of respondents to say they will enforce stricter security policies for employees in 2015.



Friday, 30 January 2015 00:00

Public apathy in the path of preparedness

Responses to winter storm Juno seem to show that you cannot please the public when it comes to preparedness. In this article Geary Sikich asks whether business continuity and emergency planners are missing something when it comes to communicating preparedness with the public.

I was supposed to be in Boston presenting at ‘The Disaster Conferences’ on 28 January 2015. Well, the weather has pushed the now rescheduled Boston conference out to 19 March 2015. I guess they are still feeling the effects of this week’s blizzard, now named ‘Juno’, which left Boston with over 24 inches of snow. According to the Weather Channel, Winter Storm Juno pounded locations from Long Island to New England with heavy snow, high winds and coastal flooding late Monday into Tuesday. The storm is now winding down, and the National Weather Service has dropped all winter storm and blizzard warnings for Juno.

Snow amounts in New York have ranged from 9.8 inches at Central Park in New York City to 30 inches on Long Island. The snippets from the Weather Channel and from other news sources barrage us with the details of this latest storm:

  • In Massachusetts, up to 36 inches of snow has been measured in Lunenburg, while Boston has seen 24.4 inches. Juno was a record snowstorm for Worcester, Massachusetts (34.5 inches); incredibly, 31.9 inches fell in Worcester on Jan. 27 alone.
  • Thundersnow was reported in coastal portions of Rhode Island and Massachusetts late Monday night and early Tuesday.



Thursday, 29 January 2015 00:00

REAL ID Act Catches Up with States

(TNS) — Lawmakers are scrambling to fix a problem that could result in Idaho driver's license holders being denied entry to federal facilities nationwide by the end of the year.

The issue arose last week, when the Idaho National Laboratory began enforcing the REAL ID Act.

The act, adopted in 2005, was a response to the Sept. 11, 2001, terrorist attacks. It tries to limit the availability of false driver's licenses and identification cards by imposing detailed security requirements on states for issuing such cards.

Idaho is one of nine "non-compliant" states, meaning the U.S. Department of Homeland Security isn't satisfied with its efforts to implement the act.

Consequently, Idaho licenses and ID cards can no longer be used to gain entry to nuclear power plants, to restricted portions of the Homeland Security headquarters building or - as of Jan. 19 - to INL and certain other federal facilities.



(TNS) —When disaster strikes in Palm Beach County, Fla., a team of volunteers trained by county emergency managers can be deployed as the first line of defense, helping their communities with everything from search and rescue to basic first aid to putting out small fires.

They can also be called upon to distribute or install smoke alarms, hand out disaster education materials or replace smoke alarm batteries in the homes of the elderly, according to a brochure about the program.

But there's no requirement that they be subject to any kind of criminal background check.

That could change after a concerned Boynton Beach resident complained to the Florida Division of Emergency Management's Inspector General. In a report released last week, the inspector recommended that background checks be a condition of the grants doled out for the program.



Business continuity and cloud file sync services provider eFolder has announced the release of the production version of Cloudfinder for Box, a dedicated cloud-to-cloud backup, search and restore service for Box. The company rolled out the production version of the offering following Box’s (BOX) long-anticipated initial public offering last week.

The production version of Cloudfinder for Box builds on a Freemium version that was available last year.  eFolder completed its acquisition of Cloudfinder in Q3 of 2014.



The Business Continuity Institute is pleased to announce the launch of its new Careers Centre, providing those working in the industry with the support they need to further their career by highlighting the job opportunities available. The BCI Careers Centre will also allow recruiters to find the perfect candidate for them by offering a CV search facility.

If you’re looking for a new job in business continuity or resilience then look no further than the BCI Careers Centre. Powered by JobTarget, the Careers Centre pulls in advertised vacancies from global recruitment sites, as well as those advertised directly with the BCI, and allows users to search by position or location. The system also allows users to set up a job alert so they can be the first to see new vacancies.

If you’re a recruiter then post your job within the Careers Centre to make sure it can be seen by a wide selection of desired candidates. If you’d rather seek people directly then search through the CVs uploaded by business continuity professionals to find the one who is suitable for you, or perhaps a selection that you would like to shortlist. The BCI Careers Centre is an open site with business continuity and resilience specialists from around the world encouraged to register for vacancies.

As the Careers Centre is specifically designed to focus on roles in the business continuity and resilience industry, it might be helpful to know what industry memberships or credentials a potential employee has. If you're a member of the BCI or hold a BCI credential then this will be clearly identified on your profile. It will also be clearly identified if you are on the BCI's CPD scheme.


Big Data will bring new challenges to data governance. Succeeding will require organizations to simplify, prioritize and above all adapt as Big Data use matures.

Yesterday, I shared four Big Data governance challenges:

  • Changing data roles
  • Broader business involvement
  • Business buy-in
  • Technical challenges

Let’s look at how those success principles can be applied to the first two Big Data governance challenges.




Is there anything that can’t be connected to the Internet? For example, where I once wore a $10 pedometer clipped to the waistband of my yoga pants, I now wear a $130 fitness tracker on my wrist. In the past, I just glanced at the numbers on the pedometer to see how many steps I’d taken; now I log into an app on my smartphone to see how far I’ve walked, how many calories I’ve burned and even how well I’ve slept. Or, if I wanted to, I could turn on any light in the house from the comfort of my couch rather than get up and do so manually. And that just scratches the surface of the phenomenon known as the Internet of Things (IoT).

However, if we know that virtually everything can now be connected to the Internet, we have to recognize its corollary statement: everything that can be connected to the Internet can be hacked. That fitness tracker I’ve come to depend on? Most of the information transmitted isn’t done securely and the apps have been known to have vulnerabilities. According to Symantec, this could make my movements easy to track and make my login details easy to steal. Those smart light bulbs, according to Slate, have insecure transmitters that could share too much information. And what about the home security system you have … you know, the one you turn on and off with your smartphone?



Wednesday, 28 January 2015 00:00

Big Data: Four New Governance Challenges

Before you move forward with Big Data, you’ll need to evolve your approach to data governance, experts say.

By now, most organizations are familiar with the basics of data governance: Identify the data owner, appoint a data steward, and so on. While those concepts are still essential to data governance, Big Data introduces new challenges that will require new adaptations.

“The arrival of Big Data should compel enterprises to re-think their approach to conventional data governance,” writes Dan O’Brien for Inside Analysis. “Everything about Big Data – its context, provenance, speed, scale and ‘cleanliness’ – extends data governance far beyond traditional, rigid databases, where it’s already an issue.”

Here’s a look at the new challenges Big Data introduces:



There has been innovation in every aspect of how individuals prepare for major snow storms – everything from funky new snow removal devices to new ways of pre-treating road surfaces for anti-icing before the onset of a major storm. Now, the real promise is in taking some of Silicon Valley’s hottest technologies — the Internet of Things, artificial intelligence, crowdsourcing, renewable energy and autonomous vehicles — and using them to improve the way cities respond to blockbuster snow events such as the Blizzard of 2015:



It was an unprecedented step for what became, in New York City, a common storm: For the first time in its 110-year history, the subway system was shut down because of snow.

Transit workers, caught off guard by the shutdown that Gov. Andrew M. Cuomo announced on Monday, scrambled to grind the network to a halt within hours.

Residents moved quickly to find places to stay, if they were expected at work the next day, or hustle home before service was curtailed and roads were closed.

And Mayor Bill de Blasio, whose residents rely upon the transit system by the millions, heard the news at roughly the time the public did.

“We found out,” Mr. de Blasio said on Tuesday, “just as it was being announced.”



Marshall Goldsmith, an executive coach to the corporate elite, is the author of the very popular book What Got You Here Won’t Get You There. And while the title may be true as it relates to your individual career path, I have news for C-suite executives everywhere: it is not true when it comes to adopting new technology. In fact, what got you here - to your current state of success - is precisely what will get you to the next level. The problem is that, as chief information officers (CIOs) and IT professionals, we sometimes allow ourselves to be pressured into acting contrary to what we know is the right thing to do.

Here’s what happens. A CEO approaches a CIO and says (in a nutshell), “What’s our cloud strategy? We have to get everything into the cloud.” The CEO has read the analysts, seen the marketing materials, been to the trade shows, and talked to peers. Is it any wonder that he or she comes to the CIO with an urgent “let’s-move-it-all-before-we-get-left-behind” deliverable? The cloud is the newest, latest, greatest, sexiest thing out there. It has benefits galore. Let’s get in on this. Now.



Were most of the data breaches that occurred in the first half of last year preventable? According to the Online Trust Alliance (OTA), a nonprofit organization that provides businesses with online security best practices, 90 percent of these incidents "could have easily been prevented."

And thanks in part to its recent findings, the OTA sits atop this week's list of IT security newsmakers to watch, followed by Adobe (ADBE) Flash Player, Kaspersky Lab founder Eugene Kaspersky and St. Peter's Health Partners.



Tuesday, 27 January 2015 00:00

The inevitability of a cyber attack


Research published by ISACA has shown that close to half (46%) of respondents to a global survey of IT professionals expect their organization to face a cyber attack in 2015 and 83% believe cyber attacks are one of the top three threats facing organizations today. Despite this, 86% say there is a global shortage of skilled cyber security professionals and only 38% feel prepared to fend off a sophisticated attack.

It is not just IT professionals who are worried about cyber attacks, the Business Continuity Institute’s own Horizon Scan report showed that cyber attacks and data breaches are two of the greatest threats to organizations. It is therefore vital that they have systems and people in place to combat these threats or, should any attack be successful as they all too often are, have processes in place to manage the aftermath.

Data breaches at a series of high profile retailers in 2014 made the issue of data security particularly visible to consumers and demonstrated the struggles that companies face in keeping data safe. Finding and retaining skilled cyber security employees is one of those challenges. In fact, 92% of ISACA’s survey respondents whose organizations will be hiring cyber security professionals in 2015 say it will be difficult to find skilled candidates.

“ISACA supports increased discussion and activity to address escalating high profile cyber attacks on organizations worldwide,” said Robert E Stroud, International President of ISACA. “Cyber security is everyone’s business, and creating a workforce trained to prevent and respond to today’s sophisticated attacks is a critical priority.”



On the surface (pardon the pun), NASA’s recent move to the cloud would not seem to have much to do with MSPs who offer cloud-based file sharing. But a closer look into the high-profile project – as recently highlighted on GigaOm – proves otherwise. 

Indeed, there are some things that all cloud transitions have in common, whether it’s the nation’s space program or a 10-person SMB. To illustrate our point, we wanted to examine this story through the lens of a managed service provider and their clients. Here we go…



(TNS) — From intuitive improvements — such as better statewide communication and pre-storm protocols — to more sensible plow blades and smarter technology for plow truck drivers, the crews at the Pennsylvania Department of Transportation’s (PennDOT) District 9 are becoming more equipped each year to handle Pennsylvania weather as efficiently as possible.

“The key words are ‘situational awareness,’” said Walter Tomassetti, assistant district executive for PennDOT’s District 9, which includes Cambria and Somerset counties. “The focus is on being ahead of the storm.”

Now, when PennDOT officials see major weather coming, such as double-digit snow, representatives from each district statewide have a pre-storm meeting to cover what resources will be needed most — and where. Depending on what’s expected, they also may set up a command center in each district.



(TNS) — There's something that has appeared on the Diamond School District campus that is so anticipated that it's drawing youngsters away from their recess to watch it in action.

It's a bulldozer, and it's turning ground outside the elementary school in preparation for a new safe room — Diamond, Mo.'s first official community storm shelter.

"If we could do something about it, then let's do it," Superintendent Mike Mabe said of his school district's proposal to try to maximize safety in case of severe weather. "It's just the right thing to do."



The recent collapse of an Interstate 75 overpass in Cincinnati, which killed a worker and injured a truck driver, is yet another reminder of the plight of America’s infrastructure, which is estimated to require billions of dollars to bring up to 2015 standards.

The bridge that collapsed had been replaced and was being torn down as part of an extended project to increase capacity on a congested, accident-prone section of the interstate, according to the Associated Press.

President Obama, speaking today in Saint Paul, Minnesota, outlined several proposals, including launching a competition for $600 million in competitive transportation funding and investing in America’s infrastructure with a $302 billion, four-year surface transportation reauthorization proposal, according to a press release from the White House. Obama also plans to “put more Americans back to work repairing and modernizing our roads, bridges, railways, and transit systems, and will also work with Congress to act to ensure critical transportation programs continue to be funded and do not expire later this year.”



Federal leaders want to like the cloud. They really do.

Then again, they have to — they’re under a cloud-first mandate. And yet they’re still not gung-ho when it comes to actually pursuing adoption, a recent survey shows.

Every year, MeriTalk surveys federal managers about cloud adoption. In the latest survey of 150 federal executives, nearly one in five say one-quarter of their IT services are fully or partially delivered via the cloud.

For the most part, they’re shifting email (50 percent), web hosting (45 percent) and servers/storage (43 percent). They’re not moving traditional business applications, custom business apps, disaster recovery, ERP or middleware.

And it seems they’re pretty happy with that so far. This year, 75 percent said they want to migrate more services to the cloud — except they’re worried about retaining control of their data.



Tuesday, 27 January 2015 00:00

Winter Storms and Power Outages

As the blizzard of 2015 starts to hit hard across the Northeast, with several feet of snow, intense cold and high winds expected, utility companies are warning of widespread and potentially lengthy power outages across the region.

In New Jersey, utility companies say it’s the high winds, with gusts of up to 65 mph, rather than the accumulation of snow, that are likely to bring down trees or tree limbs and cause outages.

Consolidated Edison Inc., which supplies electricity to over 3 million customers in New York City and Westchester County, told the WSJ that the light and fluffy snow expected in this blizzard should limit the number of power outages, but elevated power lines could come down if hit by trees.



The answer to this question depends on how fast you want your data back and how much time and effort you are prepared to spend. If your data is both mission and time critical, then full, frequent backups possibly with mirrored systems for immediate restore or failover may be the only solution. Financial trading organisations, large volume e-commerce sites and hospital emergency wards are examples. Other users who do not want to or cannot go down this route will be faced with more basic options.
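To make the trade-off above concrete, here is a minimal sketch (not from the article, and with illustrative tier names) that maps a required recovery point objective (RPO) and recovery time objective (RTO), in minutes, onto a rough class of backup solution:

```python
def choose_backup_strategy(rpo_minutes: float, rto_minutes: float) -> str:
    """Map recovery targets to a rough class of backup solution."""
    if rpo_minutes < 5 and rto_minutes < 5:
        # Mission- and time-critical: trading floors, high-volume
        # e-commerce, hospital emergency wards
        return "synchronous mirroring with automatic failover"
    if rpo_minutes <= 60:
        return "frequent incremental backups with standby restore"
    if rpo_minutes <= 24 * 60:
        return "daily full or differential backups"
    return "periodic full backups (weekly or less often)"

print(choose_backup_strategy(1, 2))     # mission-critical: mirroring
print(choose_backup_strategy(30, 240))  # frequent incrementals suffice
```

The thresholds here are assumptions for illustration; in practice each tier's cut-off comes from the business impact analysis, not a fixed rule.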



Advice from James Leavesley, CEO, CrowdControlHQ.

Social media is no longer the exclusive preserve of the ‘Facebook Generation’ eager to connect with each other or simply a channel for consumer advertisers. It is fast becoming a valuable multi-faceted communications tool with many industries actively using social media networking sites to promote their products and services and drive commercial success.

Mirroring the trend, the finance industry is also waking up to the power of engaging with customers through social media at a time when its clients are increasingly turning to online resources for information and advice. Last year, consultancy giant Capgemini forecast that social media was on its way to becoming a “bona fide channel for executing transactions,” and previously a study by Accenture stated that half of US financial advisers had successfully used social media to convert enquiries into clients. So far, so good. So what’s the catch?



Information security has become a fixture in the daily headlines, ranging from the latest high-profile data breach; to exotic hacks of USB drives, ICS devices and IOT systems; and new zero-day exploits and attack techniques. While these stories are interesting and help us understand the vulnerabilities and risks that make up the threat landscape, they reflect a frequent bias in the industry towards focusing on the ‘cool’ exploit and detection side of cyber-defense, rather than the more operational response and mitigation side. One of the results of this focus, as reported in a recent SANS study, is that for over 90 percent of incidents, the time from incident discovery to remediation was one hour or longer.

This appears to be changing, however, as new reports shine a spotlight on incident response as both welcome and essential, and now courts are reinforcing that sentiment. This article by Proofpoint considers the other side of the equation and looks at incident response. A comprehensive view of threat management includes people, processes, and tools in a process outlined below.



By Sal DiFranco

Misrepresentation isn’t reserved for entry-level interviewees. Chief Information Officer (CIO) candidates can exaggerate their accomplishments with the best of them. Let’s say you and your fellow C-suite executives need to hire a CIO. You know what you want – that picture-perfect ideal CIO candidate. Someone who is current on technology while being business savvy. Someone who takes smart risks when it comes to new technology, but who has insight on when to maintain the systems already in place. Someone who can talk to any segment of the business in their own terms, rather than resorting to technical jargon.

Of course, when interviewing CIO candidates, they will all try to make you believe they are that ideal CIO. It is up to you to identify any bull that gets tossed around during the interview process, which is why I’ve come up with five specific points to watch out for.



(TNS) — Gov. Jerry Brown’s office is urging state emergency and law enforcement agencies to take advantage of a system that uses cellphone towers to pinpoint and send alerts.

Established in 2012 through a collaboration between the Federal Emergency Management Agency, the Federal Communications Commission and the wireless industry, the Wireless Emergency Alerts system is meant to complement existing alert systems.

“The Wireless Emergency Alerts are just one addition,” said Lilly Wyatt, an Office of Emergency Services spokesperson. “It’s an additional tool that local agencies can use for public messages.”

Of the 58 counties in California, only 24 have signed up to send alerts through the system.



Tuesday, 27 January 2015 00:00

The New Reality of Weather Risk

What do you do when you are responsible for the safety of town, county or state residents and forecasts call for drastic weather conditions? Risk professionals can come under criticism if they are overly cautious, yet under-reacting can mean lives are at stake.

Take the current situation here in New York, New Jersey and Connecticut. Predictions called for one to three feet of snow and blizzard conditions over a wide swath of the tri-state area, and states of emergency were declared. Governor Andrew Cuomo of New York yesterday called for a full travel ban in 13 counties, beginning at 11:00 p.m. Those breaking the ban were subject to fines of up to $300, he said.

“With forecasts showing a potentially historic blizzard for Long Island, New York City, and parts of the Hudson Valley, we are preparing for the worst and I urge all New Yorkers to do the same – take this storm seriously and put safety first,” Gov. Cuomo said.



Monday, 26 January 2015 00:00


“I always knew I was going to be somebody. But now I wish I had been more specific.” – Lily Tomlin

In April 2014 at a conference on “Redefining Roles: Embracing the Patient as Partner,” one of the speakers, a Ph.D. and President of a division of UnitedHealthcare Corporation, began by taking a step back in time to recount the historical evolution of risk management practiced by the leading doctors of the past.

During the early settlement of the United States, the principal medical treatment consisted of “blood letting.”  In the 1700s, during the Yellow Fever epidemic, Benjamin Rush, a physician signatory of the Declaration of Independence, bled 100 to 125 people per day. Other treatments included “purging,” “sweat boxes,” “mercury ointments” and “medicinal hanging.”  The treatments sound worse than the illnesses.

Before anesthesia, medicine was a horror show, with surgery often resulting in death from shock.  Successful amputations were based on the speed and strength of the surgeon often at the expense of the fingers of surgical assistants.



Monday, 26 January 2015 00:00

How Chicago Solved Its Open Data Dilemma

In New York City, obtaining a public data set required an open records request and a researcher toting in a hard drive.

So grab a notepad, Big Apple, and let the Windy City show you how to do open data.

A recent GCN article describes how Chicago simplified the release and updating of open data by building an OpenData ETL Utility Kit.

Before the kit, the process was onerous. Open data sets required manual updates made mostly with custom-written Java code.

That data updating process is now automated with the OpenData ETL Utility Kit. Pentaho’s Data Integration ETL tool is embedded into the kit, along with pre-built and custom components that can process Big Data sets, GCN reports.
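The extract-transform-load (ETL) pattern the kit automates can be sketched in a few lines. This is a hedged illustration only — the dataset and field names below are hypothetical, and Chicago's actual kit wraps Pentaho's Data Integration tool rather than hand-written code:

```python
import csv
import io

def extract(raw_csv: str) -> list:
    """Extract: parse raw CSV text into dict records."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records: list) -> list:
    """Transform: normalize field names/values and drop incomplete rows."""
    return [
        {"name": r["Name"].strip().title(), "ward": int(r["Ward"])}
        for r in records
        if r.get("Name") and r.get("Ward", "").isdigit()
    ]

def load(records: list, target: list) -> None:
    """Load: append cleaned records to the open-data target (a list here,
    a portal API call in a real pipeline)."""
    target.extend(records)

portal = []
raw = "Name,Ward\nlincoln park,43\n,7\nhyde park,5\n"
load(transform(extract(raw)), portal)
print(portal)  # [{'name': 'Lincoln Park', 'ward': 43}, {'name': 'Hyde Park', 'ward': 5}]
```

Automating these three stages on a schedule is what replaces the manual, custom-Java updates the article describes.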



GENEVA — The number of people falling victim to the Ebola virus in West Africa has dropped to the lowest level in months, the World Health Organization said on Friday, but dwindling funds and a looming rainy season threaten to hamper efforts to control the disease.

More than 8,668 people have died in the Ebola epidemic in West Africa, which first surfaced in Guinea more than a year ago. But the three worst-affected countries — Guinea, Liberia and Sierra Leone — have now recorded falling numbers of new cases for four successive weeks, Dr. Bruce Aylward, the health organization’s assistant director general, told reporters in Geneva.

Liberia, which was struggling with more than 300 new cases a week in August and September, recorded only eight new cases in the week to Jan. 18, the organization reported. In Sierra Leone, where the infection rate is now highest, there were 118 new cases reported in that week, compared with 184 in the previous week and 248 in the week before that.



ISO 22318 is a guidance document developed by ISO to address Supply Chain Continuity Management (SCCM). It has been created to complement ISO 22301, the specification for Business Continuity Management Systems, and its associated guidance, ISO 22313.

Before Standards are finalised there is a process of review and comment that helps ensure the quality and consistency of the content they contain.

ISO 22318, despite being called a technical specification, is a guidance document that aims to help those managing BCMS programmes better address the challenge of supply chain continuity.



Monday, 26 January 2015 00:00

Make resilience your 2015 resolution

As one of the goals for the New Year, companies should take stock of how resilient they are, and take steps to improve their ability to prevent disasters, and to recover should one occur.

“As part of their business continuity management, companies assess the risks they face, prioritise them and then put mitigation plans in place. That’s prudent and best practice, and something every board should insist is being done on an ongoing basis,” says Michael Davies, CEO of ContinuitySA. “In addition, because the risk climate is becoming increasingly complex and the chances of a totally unexpected ‘Black Swan’ event ever more likely, we think companies also need to see business continuity as a way to build a business that’s resilient by nature, intrinsically prepared to bounce back from anything. Companies should also become more proactive in avoiding disruptions associated with disasters rather than reacting to them when they occur.”
In fact, Davies argues, this type of approach can help executives and their boards enhance their oversight of the company, and discharge their obligation to ensure the company’s long-term sustainability.

The formal business continuity plan and management processes should provide the starting point for setting about building a more resilient organization, says Davies.

“Once you have done your best to pinpoint all the risks and put mitigation plans in place, then it’s time to put measures in place to help ensure you are prepared for the unexpected,” he notes. “Based on ContinuitySA’s own assessment of the risk environment and our experience with clients, we think the following seven initiatives will enhance organizational resilience.”



HOB has published the results of a new survey which set out to quantify employee knowledge and understanding of their organization’s emergency procedures in the event of a natural disaster or an epidemic.

‘An Inside Look at Disaster Recovery Planning’ surveyed 916 employed people in five cities across the United States: Houston, Los Angeles, Miami, New York and San Francisco.

When asked if their place of employment has emergency procedures in place to ensure the security of company information and data, 40 percent of respondents stated their company either does not have systems in place to protect data in an emergency, or they are not aware of the existence of these procedures.



Vision Solutions Inc. has published its Seventh Annual State of Resilience Report. Entitled ‘The Future of IT: Migrations, Protection & Recovery Insights,’ the report looks at trends, opportunities and challenges.

Highlights of the report include:

  • Nearly 75 percent of respondents have not calculated the hourly cost of downtime for their business;
  • For those who experienced a storage failure, nearly 50 percent lost data in the process due to insufficient disaster recovery methods or practices;
  • Nearly two thirds of those surveyed said they delayed an important data migration for fear of downtime or lack of resources;
  • Hosted private cloud is still the most prevalent cloud environment at 57 percent usage; hybrid cloud adoption lags at 32 percent with room to grow;
  • Despite the growing popularity of cloud, nearly two thirds state they do not have high availability or disaster recovery protection in place for their data once it is in the cloud.

The report combines findings from five industry-wide surveys of more than 3,000 IT professionals.

Obtain the report after registration

Businesses face new challenges from a rise of disruptive scenarios in an increasingly interconnected corporate environment, according to the fourth Allianz Risk Barometer 2015. In addition, traditional industrial risks such as business interruption and supply chain risk (46 percent of responses), natural catastrophes (30 percent), and fire and explosion (27 percent) continue to concern risk experts, heading this year’s rankings. Cyber (17 percent) and political risks (11 percent) are the most significant movers. The survey was conducted among more than 500 risk managers and corporate insurance experts from both Allianz and global businesses in 47 countries.

“The growing interdependency of many industries and processes means businesses are now exposed to an increasing number of disruptive scenarios. Negative effects can quickly multiply. One risk can lead to several others. Natural catastrophes or cyber attacks can cause business interruption not only for one company, but to whole sectors or critical infrastructure,” says Chris Fischer Hirs, CEO of Allianz Global Corporate & Specialty SE (AGCS), the dedicated insurer for corporate and special risks of Allianz SE. “Risk management must reflect this new reality. Identifying the impact of any interconnectivity early can mitigate or help prevent losses occurring. It is also essential to foster cross-functional collaboration within companies to tackle modern risks.”



Monday, 26 January 2015 00:00

How To Move Shadow IT Into The Light

Whether you realize it or not, many companies have workstations running software that was never approved by the information technology (IT) department; instead, it has been adopted and installed by individuals or even, in some cases, entire departments. We call this use of unapproved applications or third-party cloud services ‘shadow IT’ due to its clandestine or covert status.

More often than not, these activities are not malicious in nature: they are merely a means of maintaining productivity when IT response times to support requests are sadly lacking. One key – and often overlooked – aspect of shadow IT is found in development environments where some users/developers are using public clouds to do development work, or running their own open source software in a virtual machine (VM) on someone else’s cloud.



There’s no doubt that managing databases and associated middleware has become more complicated over the years. Given the fact that the number of people with the skills needed to manage that class of IT infrastructure has not risen appreciably, there’s naturally going to be a requirement for increased reliance on automation.

With the unveiling of Oracle Enterprise Manager Cloud Control 12c Release 4, Dan Koloski, senior director of product management and business development at Oracle, says that the company has added a raft of new data governance capabilities designed to make it easier to manage large “data estates.”

The new capabilities include the ability to detect differences across databases to eliminate configuration drift, the capacity to patch fleets of databases at the same time, and tools that optimize the placement of databases based on current workloads and other IT infrastructure constraints and requirements.
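Configuration-drift detection of the kind described above boils down to diffing each database's settings against a reference. The sketch below is an assumption-laden illustration, not Oracle's actual API; the parameter names and fleet are hypothetical:

```python
REFERENCE = {"memory_target": "4G", "open_cursors": 300, "compatible": "12.1"}

FLEET = {
    "db01": {"memory_target": "4G", "open_cursors": 300, "compatible": "12.1"},
    "db02": {"memory_target": "8G", "open_cursors": 300, "compatible": "12.1"},
    "db03": {"memory_target": "4G", "open_cursors": 500, "compatible": "12.0"},
}

def find_drift(fleet: dict, reference: dict) -> dict:
    """Return {db: {param: (expected, actual)}} for every drifted setting."""
    drift = {}
    for db, config in fleet.items():
        diffs = {
            param: (expected, config.get(param))
            for param, expected in reference.items()
            if config.get(param) != expected
        }
        if diffs:
            drift[db] = diffs
    return drift

# db02 drifts on memory_target; db03 on open_cursors and compatible
print(find_drift(FLEET, REFERENCE))
```

A fleet-patching tool would then drive the drifted databases back toward the reference rather than merely reporting the diff.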



Monday, 26 January 2015 00:00

NIMS / ICS Forms – Automation

If you use ICS (Incident Command System) forms – and you’re like most users – you hate them.  While simple in design, the forms can be cumbersome to manage.  Your organization (federal, state, municipal government, gas & oil exploration and transport, public utility, etc.) may be mandated to use ICS to respond to accidents, disasters and even disruptions of normal business operations.  And like many others, you may struggle to manage use of the common ICS forms.


The forms themselves are easy to complete.  The stumbling block is collaboration.  To share an ‘in progress’ ICS form you need to print it or share it visually (on a projection or computer screen).  Both can be difficult when your operational personnel are not all in the same room.  You may resort to updating ‘in progress’ ICS forms manually (from multiple copies of a printed form) and then have someone compile them in MS Word later.  While Word forms are helpful, they lack true automation. That makes collaborative management of ICS forms cumbersome and inefficient, and can lead to errors and omissions of vital information.

If you created your own ICS form ‘wish list’ it would probably include improvements in both efficiency and collaboration:



Friday, 23 January 2015 00:00

Cyber Value-At-Risk

Measures and methods widely used in the financial services industry to value and quantify risk could be used by organizations to better quantify cyber risks, according to a new framework and report unveiled at the World Economic Forum annual meeting.

The framework, called “cyber value-at-risk,” requires companies to understand key cyber risks and the dependencies between them. It will also help them establish how much of their value they could protect if they were victims of a data breach, and for how long they can sustain their cyber protection.

The purpose of the cyber value-at-risk approach is to help organizations make better decisions about investments in cyber security, develop comprehensive risk management strategies and help stimulate the development of global risk transfer markets.



(TNS) — Despite high-profile computer attacks on Target, Sony and other major corporations, Idaho's director of homeland security said cyberthreats remain the "most important and least understood risk" to government and the private sector.

In a presentation Tuesday to the Senate State Affairs Committee, Brig. Gen. Brad Richy said the potential threats range from defaced or misleading websites to data theft and disruption of public services.

"The vulnerabilities are extreme," Richy said. "A breakdown in IT [information technology] services could take it from that sector into our industrial sector, to our water supply or electrical supply."

Cyberattacks are "a trend that's been going in the wrong direction for quite some time," said J.R. Tietsort, who heads up Micron Technology's global security efforts.



The September arrests and detentions in Australia of suspected Islamic State of Iraq and Syria (ISIS) supporters, who had allegedly been planning to kidnap random people, decapitate them, drape their bodies in the group’s flag and post the entire horrific event live to the Internet, have brought to the forefront one of the most serious yet least discussed scenarios in counterterrorism. We term it “Main Street terrorism,” by which we mean terror attacks not on a grand scale, but multiple small attacks carried out by individuals or very small groups in environments where we have traditionally felt safe.

The December hostage situation in Australia is another example. It was an attack on a soft target, a target that would not fit the “traditional” profile of being highly visible or connected to government or military operations, carried out by an individual espousing extremist beliefs but acting essentially alone.

Who remembers the pipe bombs placed in mailboxes throughout the American Midwest during spring 2002? A total of 18 bombs were placed with six of those exploding (injuring four U.S. Postal Service mail carriers and two residents) and 12 others discovered without exploding. Until the suspect was apprehended, how many of us changed our routine for something as mundane as getting the mail because, suddenly, that everyday activity had become potentially deadly?



Cosentry has expanded its disaster recovery-as-a-service (DRaaS) offering to help customers improve their data recovery times.

The data center services provider said its expanded DR service is designed to meet a full range of business recovery point objectives (RPO) and recovery time objectives (RTO), with targets ranging from less than 15 minutes to several days based on application importance and budget.
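A simple way to reason about whether a service tier like this meets an application's needs is to check the elapsed time since the last good replica against the RPO. The sketch below is an illustrative assumption, not Cosentry's service logic:

```python
from datetime import datetime, timedelta

def rpo_met(last_replica: datetime, failure: datetime, rpo: timedelta) -> bool:
    """True if the data lost (failure minus last replica) fits within the RPO."""
    return (failure - last_replica) <= rpo

# Hypothetical incident: last replication at 09:50, failure at 10:00,
# so ten minutes of data are at risk.
last_replica = datetime(2015, 1, 23, 9, 50)
failure = datetime(2015, 1, 23, 10, 0)

print(rpo_met(last_replica, failure, timedelta(minutes=15)))  # True
print(rpo_met(last_replica, failure, timedelta(minutes=5)))   # False
```

The same comparison, with restore duration in place of replication lag, applies to the RTO side of the target.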

"We anticipate that our customers will be able to implement a disaster recovery solution that meets their own specific requirements as it pertains to availability and the potential for data loss at a price that meets their budget," Craig Hurley, Cosentry's vice president of product management, told MSPmentor. "Our service expansion also looks to address the reality that many of our customers are looking to protect both virtual and physical servers."



Friday, 23 January 2015 00:00

Sharing the plan with employees


A new study titled ‘An inside look at disaster recovery planning’ has revealed just how little employees know about their organization’s planned response to a crisis. In a survey by HOB, 40% of respondents stated that their company either does not have systems in place to protect data in an emergency, or they are not aware of the existence of these procedures.

The report also revealed that, even if a plan does exist, 52% of employees are unaware of the details. This study shows just how important it is for the details of any plan that involves employees to be shared with them. The worst time to find out what to do in a crisis is once the crisis has occurred.

Over the last decade we have seen a tendency towards more flexible working environments and a greater trend towards working remotely. However, 45% of respondents noted that they either do not have the ability to access the company information that would enable them to do so, or they just don’t know if they have access.

If working remotely is one of your possible responses to a crisis, does your organization have the capability to do this? If your office is out of action and Plan B is for employees to work from home, you might be in for a surprise if 45% of your employees suddenly find out they can’t.

“For most businesses, access to and the sharing of information is critical to ongoing successful operations,” said Klaus Brandstätter, CEO of HOB. “The survey revealed that most companies are unprepared to withstand the negative consequences of disrupted operations, as many employees won’t have access to the resources and information needed to remain functional in emergency situations. In today’s world with so many unforeseen pending disasters, it is clearly paramount that companies implement comprehensive disaster recovery plans as part of their overall business continuity strategy.”


The hybrid cloud is now the new normal in cloud computing. The whole point of a hybrid cloud is to design and customize cloud capabilities that address your customer’s unique needs. But today, MSPs typically offer a one-size-fits-all service level agreement. Customers will demand a service provider that is willing and able to customize the service level agreement to meet the unique needs of their organization so that they can take advantage of the flexibility, scalability, cost reductions, and resiliency that cloud computing offers. 2015 will be the year that customers demand customized SLAs.

Service level agreements (SLAs) serve as a roadmap and a warranty for cloud services offerings. All cloud providers offer some type of standard, one-size-fits-all SLA that may or may not include the following, depending on your requirements:



Thursday, 22 January 2015 00:00

Ohio Helps Pay for Tornado-Proof Safe Rooms

(TNS) — Mary Kirstein and her partner hunkered down under a dining-room table, with their cat corralled in a laundry basket between them, as the tornado roared toward their home.

And this didn’t happen just once during Kirstein’s nine years in Houston, where tornadoes seem as common as wide-brimmed Stetsons. It happened time and again. Thankfully, she said, the big one never hit, but a person doesn’t easily forget that fear.

“Tornadoes freak me out,” said Kirstein, a purchaser at Battelle who now calls Hilliard home.

In 2012, while researching tornado safety as part of her role on a committee at work, she discovered that the state of Ohio had a new program to help pay for safe rooms that can withstand even the 250 mph winds that accompany the most-destructive EF5 storms. She filled out an application for the Ohio Safe Room Rebate Program, run by the Ohio Emergency Management Agency.



Big Data is quickly moving from concept to reality in many enterprises, and with that comes the realization that organizations need to build and provision the infrastructure to deal with extremely large volumes, and fast.

So it is no wonder that the cloud is emerging as the go-to solution for Big Data, both as a means to support the data itself and the advanced database and analytics platforms that will hopefully make sense of it all.

In a recent survey from Unisphere Research, more than half of all enterprises are already using cloud-based services, while the number of Big Data projects is set to triple over the next year or so. This leads to the basic conundrum that the business world faces with Big Data: the need to ramp up infrastructure and services quickly and at minimal cost in order to maintain a competitive edge in the rapidly expanding data economy. The convergence between Big Data and the cloud, therefore, is a classic example of technology enabling a new way to conduct business, which in turn fuels demand for the technology and the means to optimize it.



Thursday, 22 January 2015 00:00

Putting the Cloud inside Your Company Firewall

Some enterprises are attracted by the potential advantages of the cloud for disaster recovery and business continuity. However, they fear the possibility of information being spied on, stolen or hacked after it leaves their own physical premises. A little lateral thinking suggests another possible solution. Instead of moving outside a company firewall to use cloud possibilities, how about implementing cloud functionality inside the firewall? A number of vendors now offer private cloud solutions and they have some customers whose identity may surprise you.



Component distributor partners with DigitasLBi Commerce and hybris to scale its commerce capabilities in global markets


LONDON – DigitasLBi Commerce, the global connected commerce specialist and hybris software, an SAP company and the world’s fastest growing commerce platform provider, have been selected by RS Components (RS), a trading brand of Electrocomponents plc, the global distributor for engineers, to implement a new connected commerce platform. This will enable it to enhance and rapidly scale its B2B eCommerce offerings to an expanding customer base and deliver a highly personalised experience to individual customers in markets around the globe.


Under the agreement, DigitasLBi Commerce will implement the hybris Commerce Suite, a powerful and scalable single-stack commerce platform capable of delivering highly sophisticated B2B features to a global user base. The solution enables RS to further enhance its online B2B functionality while seamless integration with the company’s enterprise architecture, which includes a SAP business intelligence system, will support streamlined business operations and make the faster initiation of go-to-market strategies and new business models possible.


Guy Magrath, Global Head of eCommerce at RS, commented: “eCommerce is a major driver of growth for our business and the entry point for our customers to a long term multi-channel relationship with us. By partnering with DigitasLBi Commerce and hybris we’ll gain the ability to respond faster to new market needs and further exploit the potential of our eCommerce offer to a diverse B2B customer base.”


With operations across 32 countries and a global network of 16 distribution centres worldwide, RS is the world’s largest distributor of electronics and maintenance products, shipping over 44,000 parcels daily. With around 500,000 products available for same day dispatch and serving more than one million customers worldwide, the company is dedicated to helping customers find the right product at the right price.


As a next phase, DigitasLBi Commerce will undertake the global deployment and rollout of a new connected multi-language, multi-currency, multi-site commerce platform that can be adapted fast to changing market conditions. DigitasLBi Commerce’s robust agile implementation approach will enable RS to incrementally advance its eCommerce capabilities.


With 58 percent of global revenues generated online, RS’s ambition is to build a £1 billion plus connected commerce business, and DigitasLBi Commerce will support the brand in extending its ‘eCommerce with a human touch’ vision to further improve the online customer experience with innovative B2B functionality that makes it even easier for customers to transact.


“With connected commerce at the heart of the company’s operation, RS has to make the online customer experience the best and most relevant in each and every market they do business in,” said Jim Herbert, Managing Partner at DigitasLBi Commerce. “As a leading exponent of global hybris implementations we’re delighted to have been chosen to support RS in extending how it connects to its global audience to reach customers locally, at the point of need.”


The new multi-device optimised commerce platform will power 29 highly localised websites, and finely tune procedures that address specific market requirements. Under the agreement, DigitasLBi Commerce will enable the brand’s global connected commerce team of 100 staff, who oversee online trading, merchandising and behavioural repurchasing (email/offline event triggers across all channels and digital devices), to become fully self-supporting in their utilisation of the hybris Commerce Suite.


“In today’s market where B2B customers expect and are demanding a B2C-like experience, companies - especially industry giants such as RS - require a new breed of solutions that consider the customer interaction across touch points and channels, including that pivotal moment in the journey where a purchase is made,” explained Rob Shaw, Vice President New Business EMEA and MEE, hybris software. “hybris makes it possible to integrate web, customer service, print, mobile and social commerce that will give RS’s customers a more seamless multi-channel shopping experience.”

Now that the dust has settled on the infamous hack of Sony Pictures Entertainment, it would be prudent to take a look back at how the attack was carried out, consider what lessons IT security professionals can learn from it, and formulate a plan to counter a similar attack.

To that end, I recently conducted an email interview with Gary Miliefsky, an information security specialist and founder and president of SnoopWall, a cybersecurity firm in Nashua, N.H. To kick it off, I asked him what the likelihood is that a Sony insider assisted with the attack, and whether it could have even been carried out without the help of an insider. Miliefsky dismissed the insider theory:

While many speculate that the attack on Sony Pictures Entertainment was done by a malicious insider, I believe that the DPRK carried out the attack themselves, originally initiated from IP addresses they lease from the Chinese government. I believe they initially eavesdropped on emails to learn a pattern of behavior for socially engineering a Remote Access Trojan to be installed via email of an unsuspecting employee, inside the network.



In a Jan. 13 presentation to the federal Health IT Policy Committee, Annie Fine, M.D., a medical epidemiologist in the New York City Department of Health and Mental Hygiene, described both the sophisticated software used to track disease outbreaks such as Ebola, as well as how better integration with clinicians’ electronic health records (EHRs) would improve her department’s capabilities.

“In New York City, every day we are on the lookout for unusual clusters of illness. And we receive more than 1,000 reports a day just in my program,” Fine said. Epidemiologists run a weekly analysis to detect clusters in space and time, and use analytics and geocoding to compare current four-week periods with baselines of earlier four-week periods.
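
The temporal comparison Fine describes, checking each condition's current four-week count against historical four-week baselines, can be sketched roughly as follows (the z-score threshold and the data are illustrative only, not the department's actual method):

```python
from statistics import mean, stdev

def flag_clusters(current_counts, baseline_windows, z_threshold=2.0):
    """Flag conditions whose current 4-week count is unusually high
    relative to historical 4-week baseline windows."""
    flagged = []
    for condition, count in current_counts.items():
        history = baseline_windows[condition]  # past 4-week counts
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and (count - mu) / sigma > z_threshold:
            flagged.append(condition)
    return flagged

# Illustrative data: salmonella is far above baseline, influenza is not
current = {"salmonella": 40, "influenza": 210}
history = {
    "salmonella": [12, 15, 11, 14, 13],
    "influenza": [200, 190, 220, 205, 215],
}
print(flag_clusters(current, history))  # -> ['salmonella']
```

Production tools that health departments use also account for spatial clustering, geocoding, and seasonality; this sketch covers only the temporal comparison.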

“We get a large number of suspect cases reported, and they may be way out of proportion to the number of actual cases,” Fine said. Epidemiological investigations require hundreds of phone calls to providers and labs. “That could be made much less burdensome and more efficient if we could have improved integration with EHR data.”



Wednesday, 21 January 2015 00:00

Can You Make Disaster Information Go Viral?

What role could social media play in effectively communicating information about breaking news such as natural disasters and disease outbreaks? It’s not a new question, but one that lacks an easy answer. Researchers and emergency response personnel in San Diego plan to spend the next four years exploring the topic, and what they find may eventually serve as a model for other communities looking to better leverage social media for disaster response.

San Diego County and San Diego State University (SDSU) recently formed a partnership to research and develop a new social media-based platform for disseminating emergency warnings to citizens. The project aims to allow San Diego County’s Office of Emergency Services (OES) to spread disaster messages and distress calls quickly and to targeted geographic locations, even when traditional channels such as phone systems and radio stations are overwhelmed.



Is your business prepared for IT outages? Disaster preparedness is vital for businesses of all sizes, especially those that want to avoid prolonged service interruptions, and companies that prioritize it can also find ways to protect their critical data during IT outages.

Managed service providers (MSPs) can offer data backup and disaster recovery (BDR) solutions to help companies safeguard their sensitive data during IT outages. These service providers also can teach businesses about the different types of IT outages, and ultimately, help them prevent data loss.



Whether you are planning a traditional data center build-out or all-new cloud infrastructure, the appeal of white box hardware is difficult to resist.

Provided you need enough of a particular device to benefit from economies of scale, and you have a plan to layer all the functionality you need via software, white box infrastructure can do wonders to reduce the capital costs of any project. Plus, you always have the option to rework the software should data requirements change.

But it isn’t all wine and roses in the white box universe. As IT consultant Keith Townsend noted to Tech Republic recently, white box support costs often emerge as a fly in the ointment. Large organizations like Facebook and Google have the in-house knowledge to deploy, configure and optimize legions of white boxes, but the typical data center does not. It takes a specialized set of skills to implement software-defined server, storage and networking environments, and white box providers as a rule do not offer much support other than to replace entire units, even if only a single component has gone bad. There is also the added cost of implementing highly granular management and monitoring tools to provide the level of visibility needed to gauge a device’s operational status to begin with.



Wednesday, 21 January 2015 00:00

High Performance Data Storage Tips

Talk to many data storage experts about high-performance storage and a good portion will bring up Lustre, which was the subject of a recent Lustre Buying Guide. Some of the tips here, therefore, concern Lustre, but not all.

Use Parallel File Systems

Parallel file systems enable more data transfer in a shorter period of time than their alternatives.

Lustre is an open source parallel file system used heavily in big data workflows in High Performance Computing (HPC). Over half of the largest systems in the world use Lustre, said Laura Shepard, Director of HPC & Life Sciences Marketing at DataDirect Networks (DDN). These include systems at U.S. government labs, such as Oak Ridge National Laboratory’s Titan, as well as British Petroleum’s system in Houston.
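
The performance case for parallel file systems such as Lustre comes from striping: a file is divided across many storage servers so clients can fetch the pieces concurrently. A toy sketch of the idea (not Lustre's actual client code; stripe contents are hard-coded here for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

def read_stripe(server_id, data):
    # In a real parallel file system each stripe lives on a different
    # object storage server and is fetched over the network in parallel.
    return data

def parallel_read(stripes):
    """Fetch all stripes concurrently, then reassemble the file."""
    with ThreadPoolExecutor(max_workers=len(stripes)) as pool:
        parts = pool.map(lambda s: read_stripe(*s), stripes)
    return b"".join(parts)

stripes = [(0, b"big "), (1, b"data "), (2, b"file")]
print(parallel_read(stripes))  # -> b'big data file'
```

Because `pool.map` preserves input order, the stripes reassemble in the right sequence even though they are fetched concurrently.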



To small business owners, the buzz words from the Big Data world (i.e., petabytes, zettabytes, feeds, analytics, etc.) seem very foreign indeed. According to research from the SMB Group, only 18 percent of small businesses currently make use of Big Data analytics and business intelligence solutions. On the other hand, midsize businesses have shown greater adoption, with 57 percent of those surveyed reporting that they use BI and analytics to gain actionable information.

However, many Big Data vendors have begun creating a better story for smaller businesses, focusing more on how they can use their tools to achieve deeper insight into business data to help them make more informed decisions. And the ones that listen to this retooled message will receive a decent payoff for their efforts.



You’ve taken the time to implement a disaster recovery (DR) plan for your company – you’re prepared for anything. You’ve covered all the milestones, including:

  • Performing a Business Impact Analysis (BIA) to determine the recovery times you’ll need for your applications.
  • Tiering your applications and documenting their interdependencies so you know which order your servers should be restored in.
  • Putting your recovery infrastructure in a geographically-diverse data center.
  • Creating a comprehensive recovery playbook and testing each and every step.

Bring on the storms … the floods … the power outages … you’re ready. But are you really?



Wednesday, 21 January 2015 00:00

10 steps to cyber security


The United Kingdom’s GCHQ, in association with the Centre for the Protection of National Infrastructure, the Cabinet Office and the Department for Business, Innovation and Skills, has re-issued its ‘10 Steps to Cyber Security’ publication, offering updated guidance on the practical steps that organizations can take to improve the security of their networks and the information carried on them.

Originally launched in 2012, the guidance has made a tangible difference in helping organizations large and small understand the key activities they should evaluate for cyber security risk management purposes. The 2014 Cyber Governance Health Check of FTSE 350 Boards showed that 58% of companies have assessed themselves against the 10 Steps guidance since it was first launched, compared to 40% in 2013.

‘10 Steps to Cyber Security’ has been updated to ensure its continuing relevance in the climate of an ever growing cyber threat. It now highlights the new cyber security schemes and services that have been set up more recently under the National Cyber Security Programme.

The Business Continuity Institute’s Horizon Scan report has consistently shown that cyber attacks and data breaches are two of the biggest concerns for business continuity professionals with the latest report highlighting that 73% of respondents to a survey expressed either concern or extreme concern at the prospect of one of these threats materialising.

Robert Hannigan, Director of GCHQ, said: “GCHQ continues to see real threats to the UK on a daily basis, and the scale and rate of these attacks shows little sign of abating. However despite the increase in sophistication, it remains as true today as it did two years ago that there is much you can do yourself to protect your organisation by adopting the basic Cyber Security procedures in this guidance.”


With more enterprise IT organizations relying on software-as-a-service (SaaS) applications than ever, securing the data that flows in and out of those applications has become a major challenge and concern.

To give IT organizations more control over that data, Protegrity today unveiled the Protegrity Cloud Gateway, a virtual appliance that, once deployed on a server, enables organizations to apply policies to the flow of data moving in and out of multiple SaaS applications.

Protegrity CEO Suni Munshani says it applies a mix of encryption and vaultless tokenization to make sure data residing in a SaaS application can only be viewed by users that have been given explicit rights to see that data. Those rights are assigned using a “configuration-over-programming” (CoP) methodology that allows administrators to configure the gateway without having programming skills.
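
Protegrity's scheme is proprietary, but the general shape of vaultless tokenization, deriving tokens deterministically from a secret key rather than storing a lookup table, combined with an explicit rights check on detokenization, might be sketched like this (a toy keystream cipher stands in for the vetted format-preserving encryption a real product would use):

```python
import hashlib

SECRET_KEY = b"demo-key"   # illustrative only; real systems use managed keys
AUTHORIZED = {"alice"}     # users granted explicit detokenization rights

def _keystream(n: int) -> bytes:
    # Deterministic keystream derived from the secret key
    # (toy construction, not a production cipher).
    stream, counter = b"", 0
    while len(stream) < n:
        stream += hashlib.sha256(SECRET_KEY + counter.to_bytes(4, "big")).digest()
        counter += 1
    return stream[:n]

def tokenize(value: str) -> str:
    data = value.encode()
    ks = _keystream(len(data))
    return bytes(a ^ b for a, b in zip(data, ks)).hex()

def detokenize(token: str, user: str) -> str:
    if user not in AUTHORIZED:
        raise PermissionError(f"{user} has no rights to this field")
    data = bytes.fromhex(token)
    ks = _keystream(len(data))
    return bytes(a ^ b for a, b in zip(data, ks)).decode()

t = tokenize("4111-1111-1111-1111")
print(detokenize(t, "alice"))  # authorized user recovers the clear value
```

Note there is no token vault: the clear value is recoverable from the token and the key alone, which is what distinguishes vaultless tokenization from vault-based designs.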

Support for SaaS applications is provided by accessing the public application programming interfaces (APIs) those applications expose, with support for each additional SaaS application that Protegrity supports taking a few days or weeks to add, depending on the complexity of the project.



A new survey of more than 3,000 IT decision-makers worldwide revealed the majority of businesses are "behind the curve" when it comes to their data protection strategies. The survey showed that most businesses are "not very confident" that they can fully recover their critical data after an IT service disruption, yet they also considered data protection "to be totally critical to their success."



Tuesday, 20 January 2015 00:00

The Modular Approach to a Scalable Cloud

Following up on my previous post regarding hyperscale infrastructure, I feel I should point out that once the decision to go hyperscale has been made, it will most likely take place in a greenfield hardware environment.

Unless you are already working with a state-of-the-art data facility, any attempt to convert complex, multiformat legacy environments will almost certainly lead to a morass of integration issues. The key benefit to hyperscale is that it is both large and flexible, allowing data executives to craft multiple disparate data architectures completely in software. This is why current hyperscale plants at Google and Facebook rely on bulk commodity hardware.

But as I mentioned last fall, the average enterprise does not have the clout to purchase tens of thousands of stripped down servers and switches at a time, and besides, all those components still need to be deployed, provisioned and integrated into the cluster, which takes time, effort and of course, money.



(TNS) — Until now, North Texas has been one of the least likely places in the country to have an earthquake.

But after the Dallas area suffered a series of more than 120 quakes since 2008, the U.S. Geological Survey is re-evaluating the metroplex’s “seismic hazard” — or the risk of experiencing earthquakes.

This year, for the first time, the USGS will include quakes believed to have been caused by human activity in its National Seismic Hazard Map, which engineers use to write and revise building codes, and which insurers use to set rates.

The map predicts where future earthquakes will occur, how often they will occur and how strongly they will shake the ground.



(TNS) — "A rising tide lifts all boats," John F. Kennedy said, in defense of the government taking on big public works projects for the greater good.

About 10 of Iowa's river towns will share a $600 million pot of state money, based on the belief that sales tax revenue will rise and commercial and residential development will flourish along riverfronts if they are protected from flooding with sophisticated green and hard infrastructure.

Flooding in Iowa is occurring more often, making the nomenclature of 100-year or 500-year flood levels meaningless. The city of Burlington had 500-year floods in 1993 and 2008, an interval of just 15 years.

Cedar Rapids, which sustained $6 billion of the state's $10 billion flood damage in 2008, led the way in convincing the Legislature to establish a flood mitigation fund.



Tuesday, 20 January 2015 00:00

Defining the Five Lines of Defense

As the Board of Directors focuses its attention on risk oversight, there are many questions to consider. One topic the Board should consider is how the organization safeguards itself against breakdowns in risk management (e.g., when a unit leader runs his or her unit as an opaque fiefdom with little regard for the enterprise’s risk management policies, a chief executive ignores the warning signs posted by the risk management function or management does not involve the Board with strategic issues and important policy matters in a timely manner). As illustrated during the financial crisis, the result of these breakdowns can be the rapid loss of enterprise value that took decades to build.

An effectively designed and implemented lines-of-defense framework can provide strong safeguards against such breakdowns. From the vantage point of shareholders and other external constituencies (an external stakeholders’ view), we see five lines of defense supporting the execution of the organization’s risk management capabilities. They are outlined below.



Cyberattacks are clearly on the minds of President Barack Obama, Islamic State jihadists, Sony Pictures execs and the CBS producers who are launching a new show this spring called CSI: Cyber. On Jan. 13, Obama announced plans to reboot and strengthen U.S. cybersecurity laws in the wake of the Sony Pictures hack and the one on the Pentagon's Central Command Twitter account from sympathizers of the Islamic State. Whether a real attack or depicted in television and films like Blackhat, this flood of cyberattacks means that hackers are relentless and more sophisticated than ever before.

I’m not a fear monger by trade but want to sound the alarm that there is another cyber-risk that is looming and warrants attention of our emergency management community and government: electronic health records. The American Recovery and Reinvestment Act of 2009 authorized the Centers for Medicare and Medicaid Services to award billions in incentive payments to health professionals (hospitals, long-term care agencies, primary care, etc.) to demonstrate the meaningful use of a certified electronic health record (EHR) system.

The intent of creating EHR systems is to improve patient care by providing continuity of care from provider to provider through health information exchanges (HIEs) that allow “health-care professionals and patients to appropriately access and securely share a patient’s vital medical information electronically,” says HealthIT.gov. In addition, financial penalties are scheduled to take effect in 2015 for Medicare and Medicaid providers who do not transition to electronic health records.



As the enterprise tries to make the data center more efficient in the face of rising operating costs, one problem keeps recurring: Disparate infrastructure makes it very difficult to determine what systems and solutions are in place and how they interact with each other.

The data center, after all, is a collection of assets, so it only makes sense to have a good idea of what those assets are and how they operate in order to either improve their efficiency or swap them out for new, better assets.

The idea of asset management (AM) in the data center is not new – in fact, it is a bustling business. MarketsandMarkets puts the total value of the AM industry at $565.4 million, with annual growth rates averaging 34 percent between now and 2019 to top out at more than $2 billion. The report segments the market by region, components, services, support and other factors, concluding that efficiency, management, planning and expansion of data footprints are key drivers, while limiting factors include tight budgets, poor awareness of available solutions, and a lack of perceived benefits. And as with most technology solutions these days, established markets in Europe and North America provide the bulk of activity, while emerging markets represent the fastest growth.
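
As a rough sanity check on those figures, 34 percent compound annual growth on a $565.4 million base does carry the market past $2 billion by 2019:

```python
base = 565.4  # market value in $ millions (MarketsandMarkets estimate)
for year in range(2015, 2020):
    base *= 1.34  # 34% average annual growth
    print(year, round(base, 1))
# the projected value first exceeds $2,000M in 2019
```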



For most organizations, employees, or human resources, account for the largest percentage of total costs. Dr. Mark Huselid, Distinguished Professor of Workforce Analytics at Northeastern University’s D’Amore-McKim School of Business and Director of the Center for Workforce Analytics, says the workforce often represents fully 60 to 70 percent of all expenses. Quite clearly, refining workforce management and attempting to “connect human capital and performance with management strategy and business goals” is a keen point of interest for both HR and upper management.

The fact that a Professor of Workforce Analytics position exists is intriguing, and the sort of academic research that the Center for Workforce Analytics conducts may well result in some rather unexpected outcomes for some industries. Consider this idea, for example: “Most organizations tend to invest in talent hierarchically, where senior-level talent gets the most pay, best development opportunities and other professional perks. However, organizations should be managing vertically in who and what really matters – and in measuring and managing the outcomes associated with these processes.”

In the tech world, the idea of investing a higher percentage of pay and perks in less senior and less experienced employees is not foreign. Raising pay rates and bonuses for, say, highly in-demand developers and designers can often be easily justified in shortened time-to-market or other deliverables. In other areas, though, HR and the business would have a hard time with the concept without some solid predictive numbers.



Organizations that reap high return rates on Big Data projects do so by changing operational systems for everybody, rather than “enlightening a few with pretty historical graphs,” according to IT research and consulting firm Wikibon.

How do you do that? You stop using Big Data to drive “using the rear-view mirror.” Instead, you couple Big Data insights with in-memory technologies so you’re “driving with real-time input through the front windshield,” writes David Floyer, a former IDC analyst and co-founder and CTO of Wikibon.org.

Floyer’s lengthy piece on Big Data ROI goes into the technical details on how you piece this together. His technologist background really shows, though, so here are a few terms you’ll need to know to follow it:




More than a third (34%) of IT professionals claim that their organization has suffered a major incident that has required them to implement disaster recovery procedures. In the event of such a disaster or other incident occurring, 51% believe they are only ‘somewhat prepared’ to recover their IT and related assets, and of those who had experienced a major incident, more than half lost data and 11% experienced permanent IT losses.

These were some of the findings in a report published by Evolve IP which also showed that the leading causes of such incidents are hardware outages (47%), environmental disasters (34%), power outages (27.5%) and human error (20%). Perhaps surprisingly, a significant number of organizations continue to use legacy methods for disaster recovery. 45% of those surveyed continue to use backup tapes and 41.5% use additional servers at their primary site as a principal method for disaster recovery.

“For many organizations the question isn’t ‘if’ they will suffer a disaster, it’s ‘when,’” says Tim Allen, Chief Sales Officer of Evolve IP. “As we saw in the survey, when disaster hits, it hits hard typically taking over a day to recover and causing financial as well as data losses.”

The results of this survey demonstrate why IT-related threats are the biggest concern for business continuity professionals, as shown in the Business Continuity Institute’s annual Horizon Scan report. The latest report revealed that 77% of respondents to a survey expressed either concern or extreme concern at the prospect of an IT or telecoms outage occurring.


(TNS) — In the wake of the March 11, 2011, Great East Japan Earthquake, local and prefectural governments around the country rushed to assist the Tohoku region, sending material aid and personnel, while private firms and individuals arrived to volunteer their services wherever they were needed.

Few were as quick to respond as Hyogo Prefecture and the city of Kobe, which had experienced their own earthquake in January 1995, and had worked in the intervening years to become Japan’s premier center for disaster response-related knowledge, and an example that towns, cities and prefectures in Tohoku could use as they attempted to rebuild.

At a recent symposium, held ahead of the 20th anniversary this Saturday of the Great Hanshin-Awaji Earthquake and attended by officials and representatives of nonprofit organizations from Iwate and Hyogo prefectures, Hyogo Gov. Toshizo Ido and Iwate Gov. Takuya Tasso spoke on the administrative and planning challenges governments face when dealing with a large-scale natural disaster.



How to balance the risks and rewards of emerging technologies is a key underlying theme of the just-released World Economic Forum (WEF) 2015 Global Risks Report.

The rapid pace of innovation in emerging technologies, from synthetic biology to artificial intelligence has far-reaching societal, economic and ethical implications, the report says.

Developing regulatory environments that can adapt quickly enough to safeguard the rapid development of these technologies and allow their benefits to be reaped, while also preventing their misuse and any unforeseen negative consequences, is a critical challenge for leaders.



A new survey from Chicago-based managed security service provider (MSSP) Trustwave revealed that organizations with 1,000 Internet users or fewer spent more than twice as much on IT security on a per-user basis as larger organizations (those with more than 1,000 Internet users).

The survey of 172 IT professionals showed that IT security cost $157 per Internet user in smaller organizations versus $73 per user in larger ones.

Also, Trustwave found that 28 percent of all respondents said they believed they were not getting full value out of their security-related software investments.



Integration isn’t an excuse to avoid trying SaaS enterprise applications, argues principal cloud architect Mike Kavis.

“Sometimes enterprise IT executives think their requirements are so different than those of other companies that they cannot be met by a SaaS provider. This thought process is often nothing more than a poor excuse …” Kavis writes.

Kavis is also now a vice president at Cloud Technology Partners, but I’ve followed his writings for years. Kavis is an industry veteran with extensive experience as an architect and IT analyst.



Forecasting what the IT security landscape will look like in the year ahead has become an annual technology tradition, and following 2014 as the Year of the Data Breach, I think anyone could make a fairly accurate guess as to what the major trend of the New Year will be: more data breaches.

Forty-three percent of organizations reported a data breach in the past year, a figure that Forrester predicts will rise to 60% in 2015. And it’s not just the frequency of breaches that will escalate in the year ahead; malware will also become increasingly difficult to dismantle. P2P, darknet and Tor communications will become more prevalent, and forums selling malware and stolen data will retreat further into hidden corners of the Internet in an attempt to avoid infiltration.

By now, it is no longer a matter of if your business is going to be breached, but when. The last thing any organization needs as we enter another year of risk is a blind side. The good news, though, is that there are ways to prevent these attacks if we act immediately.



With the arrival of Ebola in the U.S. came public fear, widespread misinformation, and the ever-present danger of contamination and contagion. While the cases have been isolated, the threat of the virus required state and local leaders to assume unprecedented leadership and extreme diplomacy in dealing with the public, the medical community, and even medical suppliers and contractors, who balked at handling blood samples, soiled linens and hospital waste out of fear of the virus.

But when a virus like Ebola hits a jurisdiction, there is a hefty fiscal price as well. In Texas, Dallas County was the first U.S. locality to deal with the sudden challenge of an outbreak. The impact on the budget was not inconsequential. It cost the county a quarter of a million dollars to gut and decontaminate the one small apartment of the nation’s first Ebola victim, Thomas Eric Duncan -- part of the approximately $1 million the county expended in the first weeks of the crisis.

Unlike with some contagions, the unknowns with Ebola could constitute the gravest challenge. There are surprising gaps in scientists’ knowledge about the virus, including the time it can survive in different environments outside the body. That is information vital to EMTs, solid waste departments, hospitals and clinics, and public and private water and wastewater systems -- as well as public transportation agencies.



Thursday, 15 January 2015 00:00

Mapping for Ebola: A Collaborative Effort


One of the difficulties faced by teams responding to the current Ebola outbreak in West Africa is identifying individuals and communities residing in remote areas. Maps of these regions either do not exist or are inadequate or outdated. This means that basic data like the locations of houses, buildings, villages, and roads are not easily accessible, and case finding and contact tracing can be extremely difficult.

To aid the outbreak response effort, volunteers from around the world are using an open-source online mapping platform called OpenStreetMap (OSM) to create detailed maps and map data of Guinea, Sierra Leone, Liberia, and parts of Mali.


Commonly referred to as “Wikipedia for maps,” OSM is working toward the goal of making a map of the world that is freely available to anyone who wants to use it. The Humanitarian OpenStreetMap Team (HOT) is a U.S.-based non-profit organization that represents a subset of the OSM community. HOT’s mission is to use OSM data and tools to help prepare and respond to humanitarian disasters. Because OSM data is available for free download anywhere in the world, volunteer mappers generate data that are useful not only to CDC but also to other agencies involved in the Ebola response, such as Doctors Without Borders (MSF), International Red Cross (IRC), and World Health Organization.

Mappers frequently use satellite images to identify villages, houses, paths, and other details that were previously unmapped. The U.S. State Department’s Humanitarian Information Unit (HIU) is supporting HOT and OSM by creating the MapGive.org website, which provides easy-to-follow instructions on how to begin mapping very quickly. Personnel in CDC’s Division of Global Migration and Quarantine (DGMQ) are coordinating with HIU and HOT to support and promote volunteer mapping in affected West African areas where CDC teams are currently working.

Members of Emory’s Student Outbreak and Response Team (SORT) are some of these volunteer mappers. SORT is a graduate student organization that collaborates with CDC and provides hands-on training in outbreak response and emergency preparedness. Ryan Lash, a mapping scientist in DGMQ’s Travelers’ Health Branch, initially contacted SORT for help in August as the number of Ebola cases in West Africa continued to rise. He has since provided two workshops for SORT members, taught a small number of CDC staff, and trained students at the University of Georgia.


In the 8 months that HOT has been mapping countries with Ebola outbreaks, more than 2,500 volunteers have mapped more than 750,000 buildings and hundreds of kilometers of roads, resulting in detailed maps of affected West African communities. Not only do these maps help first responders and other organizations around the world, they also contribute to the national information infrastructure essential to the recovery and rebuilding of affected regions. The value of OSM was highlighted especially well during the 2010 Haiti earthquake, after which the U.S. State Department decided to promote volunteer mapping as a way for the general public to get involved in humanitarian emergencies.

Volunteer mapping in OSM for HOT can be done by anyone. All you need is a computer, an internet connection, and the time and willingness to learn. Find out more about how you can help here: Learn to Map

It’s a near-daily occurrence for most enterprises – a laptop or server becomes obsolete or unusable. But one crucial step is often forgotten before replacement equipment is brought in. How do you ensure that the old device is cleansed of all usable traces of important data before it is disposed of?

Many organizations have internal procedures for disposing of technology, and those steps include wiping hard drives of data or restoring a device to its original status before use. But does this alone ensure that no discernible traces of private data are left on the media? Are there ways to absolutely be sure that the organization’s confidential information has been completely and absolutely removed? Or is there a level of data removal that may not be complete, but is acceptable?



All business in a competitive market is risk-based, whether or not enterprises admit it. Positive risk indicates opportunities. Negative risk points to the need to take measures to avoid, transfer or mitigate that risk. Banks are a case in point, with risk analysis at the heart of their daily activities as they continually calculate the probabilities of profitability in investments and loans. For enterprises in other sectors, risk may be less in the spotlight, but no less important. All companies need good disaster recovery and business continuity management for instance. Both depend on properly assessing risks and their impact. So how can you tell if senior management is taking risk management seriously?



Thursday, 15 January 2015 00:00

Global Risks 2015

The World Economic Forum has published its annual look ahead at the risks that are likely to dominate in the coming years.

The biggest threat to the stability of the world in the next 10 years comes from the risk of international conflict, according to the 10th edition of the World Economic Forum Global Risks report.

The report, which every year features an assessment by experts on the top global risks in terms of likelihood and potential impact over the coming 10 years, finds interstate conflict with regional consequences as the number one global risk in terms of likelihood, and the fourth most serious risk in terms of impact.




Less than twelve months ago the UK suffered severe flooding in many parts of the country and this is not an infrequent occurrence. During the last five years over half (51%) of businesses have experienced some form of damage through floods, wind and thunderstorms alone and this can often prove costly. The situation could be further exacerbated within smaller organizations as a new study has shown that 46% of small to medium sized businesses (SMBs) haven’t considered a business continuity plan to carry on trading or mitigate losses.

There are nearly five million SMBs in the UK and each one risks suffering on average £38,311 worth of damage because of the elements. As a result, the potential cost to the economy could be as high as £86 billion. Weather chaos means small businesses could also lose over three working days (26 hours) of staff time.

Weather is a threat to many organizations, so much so that in the Business Continuity Institute’s Horizon Scan report, adverse weather came fourth in a list of potential threats with 57% of respondents to a survey expressing either concern or extreme concern at the possibility of their organization suffering a disruption as a result.

The findings come from a survey of 1,000 SMBs conducted by Towergate Insurance and designed to ascertain the impact of bad and unexpected weather on the UK’s mass of smaller firms. The nationwide survey also reveals 43% of UK SMBs either simply do not have cover or do not know whether they are covered in the event of serious bad weather.

Commenting on the findings, James Tugendhat of Towergate Direct, said: “Small businesses are a vital part of the UK economy and can’t afford to lose money due to the unpredictable British weather. Whilst the good old British weather has become a joke, losing large sums of money or business days due to damage is no laughing matter. Making sure businesses are aware of the risks bad weather poses and how to mitigate against it means SMBs can be guaranteed peace of mind and get back to the business of business.”


By Sue Poremba

Hybrid cloud. BYOD. Big Data. Internet of Things. These are terms that have become part of the daily lexicon, not only within the information technology (IT) and cyber security world but also in the mainstream. Jargon is integral to IT: buzzwords make complicated concepts more accessible to the non-technical person, even if the concepts themselves are no easier to understand.

Buzzwords are commonplace in IT security, as well, but are they truly understood? As Frank Ohlhorst writes in Tech Republic, “it seems that IT security managers are giving too much power to terms and buzzwords, letting them dictate security best practices.” Ohlhorst goes on to point out that while BYOD is just an acronym that means, simply, Bring Your Own Device (such as when a company allows its employees to use their personally-owned phones, laptops, and other devices to access the network for work purposes), security professionals see it as Bring Your Own Disaster and the beginning of a security nightmare.



It would be interesting to see what would happen if there was another Ebola scare in the U.S. The answer might depend on when it happened and perhaps where the person became infected. But chances are the health infrastructure would handle it, and perhaps respond to another infectious disease outbreak much better, having had the experience that the recent Ebola episodes provided.

That experience included hiccups and communication errors that resulted not in panic but in disagreement on the part of some in the health community and alarm among the public. One target of criticism is the Centers for Disease Control and Prevention (CDC), which from the beginning confidently asserted that hospitals throughout the U.S. were ready to handle Ebola cases and told the public how difficult the infection is to transmit. The CDC declined to participate in this discussion.

When Thomas Eric Duncan, who eventually died, was first found to have Ebola the CDC sought to calm fears and educate the public about the likelihood of the disease spreading by normal contact with an infected individual, and what should be done if someone was thought to have symptoms. It also expressed confidence in the ability of the health infrastructure to deal with an outbreak.



Wednesday, 14 January 2015 00:00

Data Storage Benchmarking Guide

Data storage benchmarking can be quite esoteric in that vast complexity awaits anyone attempting to get to the heart of a particular benchmark.

Case in point: The Storage Networking Industry Association (SNIA) has developed the Emerald benchmark to measure power consumption. This invaluable benchmark has a vast amount of supporting literature. That so much could be written about one benchmark test tells you just how technical a subject this is. And in SNIA’s defense, it is creating a Quick Reference Guide for Emerald (coming soon).

But rather than getting into the nitty-gritty nuances of the tests, the purpose of this article is to provide a high-level overview of a few basic storage benchmarks, what value they might have and where you can find out more.
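To give a flavor of what even the simplest benchmark measures, here is a minimal, purely illustrative sketch of a sequential-write throughput test. This is not any SNIA test, and real benchmarks control caching, queue depth, and access patterns far more carefully:

```python
import os
import tempfile
import time

def sequential_write_throughput(total_mb=32, block_kb=1024):
    """Write total_mb of data in block_kb chunks and return MB/s.
    A toy illustration only; real storage benchmarks are far more
    rigorous about caching and access patterns."""
    block = b"\0" * (block_kb * 1024)
    blocks = (total_mb * 1024) // block_kb
    fd, path = tempfile.mkstemp()
    try:
        start = time.perf_counter()
        with os.fdopen(fd, "wb") as f:
            for _ in range(blocks):
                f.write(block)
            f.flush()
            os.fsync(f.fileno())  # force data to stable storage before stopping the clock
        elapsed = time.perf_counter() - start
        return total_mb / elapsed
    finally:
        os.remove(path)

print(f"{sequential_write_throughput():.1f} MB/s")
```

Even a sketch this small shows why benchmark results are hard to compare: change the block size, skip the `fsync`, or write to a cached filesystem, and the number can swing by an order of magnitude.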



Post-apocalyptic movies such as “The Road Warrior,” “I Am Legend” and “The Matrix” have long been a Hollywood staple. You will need something more than a backup service to keep your business going in the event of nuclear war or an alien invasion, but for customers of MSPs, disasters are not an all-or-nothing proposition. Instead, they encompass a whole range of large and small incidents that can result in data and service losses. A properly designed disaster recovery system will protect against:

  • Ransomware
  • Accidental deletion
  • Hardware failure
  • Software corruption
  • Power surges, brownouts or outages
  • Lost smartphones, laptops and tablets
  • Fires and fire protection system damage
  • Vandalism
  • Theft
  • And whatever floods, earthquakes, tornados, tsunamis, lightning strikes, hurricanes or blizzards our dear, sweet Mother Nature decides to give us

Here are five critical factors MSPs should keep in mind when setting up their own and their customers’ systems for easy data recovery after a disaster.




When was the last time you conducted a business continuity exercise? Were your colleagues enthusiastic participants? It’s not always easy to get buy-in, either from top management who don’t want to fund it, or from your non-BC colleagues who don’t have time to take part.

This is why 'testing times' was chosen as the theme for Business Continuity Awareness Week as we want to support you in explaining to your colleagues just how important testing and exercising is to the whole business continuity process. To put it simply, a plan that has not been exercised is not a plan! We also want to support you in organizing your exercises, or more to the point we want BC professionals to support each other.

To begin with, the Business Continuity Institute has produced a series of posters that are free to download and can be placed in a prominent location in your workplace to highlight the importance of exercising your plans. Each poster asks the question:

When do you want to find out your business continuity plan doesn’t work?
A) During an exercise?
B) When an incident occurs?

These posters can be found on the BCAW website.


We also plan to post a series of case studies, white papers and other material that would support your case for an exercise or help you in planning one, but for this we need your help. We need the help of those people who do this work on a daily basis. Have you recently run an exercise? Then why not submit a case study? It doesn't have to be lengthy; just say a little about what you did. Have you recently conducted some research? Then perhaps you'd like to submit a white paper; it could provide some great publicity for you and your organization.

As with previous years, we are putting together an extensive webinar programme where business continuity experts will discuss the relevant issues relating to the theme and offer the viewer the opportunity to ask questions.

If you would like to become involved with BCAW, either by submitting material or hosting a webinar, or if you would just like further information, please do get in touch by emailing Andrew Scott.

Tuesday, 13 January 2015 00:00

CDC: Flu Season a Bad One

(TNS) — The Centers for Disease Control and Prevention said this year’s flu season is shaping up to be a bad one.

Already there have been 26 confirmed pediatric deaths and flu is widespread in almost “the entire country,” CDC director Tom Frieden said on a conference call with reporters Friday morning.

The number of hospitalizations among adults aged 65 and older is also up sharply, rising from a rate of 52 per 100,000 last week to 92 per 100,000 this week, Frieden said.

And there are still more hospitalizations and deaths likely to come. The nation is about seven weeks into this year’s flu season, and seasons typically last about 13 weeks, Frieden said.

“But flu season is unpredictable,” he said, adding it could last longer than 13 weeks.



Not too long ago, organizations fell into one of two camps when it came to personal mobile devices in the workplace – these devices were either connected to their networks or they weren’t.

But times have changed. Mobile devices have become so ubiquitous that every business has to acknowledge that employees will connect their personal devices to the corporate network, whether there’s a bring-your-own-device (BYOD) policy in place or not. So really, those two camps we mentioned earlier have evolved – the devices are a given, and now, it’s just a question of whether or not you choose to regulate them.

This decision has significant implications for network security. If you aren’t regulating the use of these devices, you could be putting the integrity of your entire network at risk. As data protection specialist Vinod Banerjee told CNBC, “You have employees doing more on a mobile device and doing it ad hoc here and there and perhaps therefore not thinking about some of the risks that are apparent.” What’s worse, this has the potential to happen on a wide scale – Gartner predicted that, by 2018, more than half of all mobile users will turn first to their phone or tablet to complete online tasks. The potential for substantial remote access vulnerabilities is high.



Tuesday, 13 January 2015 00:00

Tackling the Unstructured Data in Big Data

There’s a lot of talk about Big Data as if it is one entity. We hear: How do you manage Big Data? How do you govern Big Data? What’s the ROI for Big Data? The problem with this is that it puts too much focus on the technology, while obscuring one of the major challenges in Big Data sets: the unstructured data. 

I suspect CIOs haven’t forgotten that component since about 80 percent of data in organizations today is unstructured data, according to Gartner. That’s a lot of value currently hiding in social media, customer call transcripts, emails and other text-based or image-based files.

That’s a challenge, because the unstructured data also happens to be where you may find the real value in Big Data. These disparate data sets were previously unanalyzed or sitting in application silos. Obviously, Hadoop will let you migrate them into one location, but what then? How do you turn that into valuable information?

This recent Datamation column by Salil Godika goes a long way toward answering these questions. Godika is the chief strategy & marketing officer and Industry Group head at Happiest Minds. I admit this gave me pause, because pieces by chief marketing officers can be too self-serving.



There are times when you wish you could undo what you just did. Sometimes, you can’t. Financial investments, office reorganisations and even that too-hasty email you sent often cannot simply be reversed. With IT on the other hand, it’s a different story. From individual PCs to corporate data centres, the ‘Undo’ function has become a standard feature of many computing systems for making errors and problems disappear. As little as one mouse click may be enough to turn back the hands of time and begin again as though a mistake had never been made. But is this disaster recovery capability the magical solution it is often made out to be?



The concept of digital transformation is not a new one, as technology has been used to augment business functions since the dawn of the computer age. However, these days, digital transformation means different things to different companies, requiring each company to tailor its integration of technology in a way that increases productivity and improves communication with internal and external parties.

Personally, I like the Altimeter Group’s definition of digital transformation, since it is the most appropriate for modern market-focused usage: “The realignment of, or new investment in, technology and business models to more effectively engage digital customers at every touch-point in the customer experience lifecycle.” In most cases, the goals of digital transformation include better engagement with digital customers, greater collaboration with internal resources, and improved efficiency.




It may not come as a surprise that cyber security incidents are on the rise. Open any newspaper today and you will no doubt come across yet another article highlighting some organization that has become the latest victim of a breach in online security.

Just how much these incidents are on the rise is perhaps more concerning, however. A recent report produced by PwC on the Global State of Information Security has shown that the number of information security incidents reported by survey respondents increased from 28.9 million in 2013 to 42.8 million in 2014, a 48% increase. The report also cites additional research suggesting that 71% of compromises go undetected, meaning that 42.8 million could be just the tip of the iceberg.

The Business Continuity Institute’s Horizon Scan report has consistently shown that cyber attacks and data breaches are a major concern for business continuity professionals with the latest survey highlighting that 73% of respondents to a survey expressed either concern or extreme concern at the prospect of one of these threats materialising.

The cost of security incidents can be high with the report noting that a recent study by the Center for Strategic and International Studies estimated that the annual cost of cybercrime to the global economy could be somewhere between $375 billion and $575 billion. The report further notes that this doesn’t cover the cost of IP theft which could range from $749 billion to as much as $2.2 trillion.

You might think that, with this significant increase in the number of security incidents and the financial impact these incidents can have, budgets would also be increasing in order to protect against them. However, the opposite appears to be the case: the report reveals that the average security budget among respondents decreased by 4% from the previous year.

The Global State of Information Security Survey 2015 was a worldwide study by PwC, CIO, and CSO conducted online during the first half of 2014. Readers of CIO, CSO and clients of PwC from around the globe were invited to take the survey and the results discussed in the report are based on the responses of more than 9,700 CEOs, CFOs, CIOs, CISOs, CSOs, VPs and directors of IT and security practices across more than 154 countries.

As companies increasingly turn to the public cloud to house various components of their IT infrastructures, it will probably always be the case that other components remain on-premise. So the question of how best to manage that hybrid environment is one that an increasing number of IT pros will have to be able to answer.

I discussed that question in a recent email interview with Lynn LeBlanc, co-founder and CEO of HotLink, a hybrid IT management software provider in Santa Clara. I started by asking LeBlanc what she finds companies tend to keep on-premise, and why they’re going that route. She said the reasons for hybrid cloud deployments vary from organization to organization, but it’s generally more a question of what they want to put in the cloud:



Never before have there been so many options for alerting the public. In the last few months alone, new alerting channels have emerged to complement an already growing array. Names like Google, Twitter, Facebook and the Weather Channel have entered the alerting field. Legacy vendors have enhanced their offerings. The federal government now has impressive alerting success stories to tout. An industry and practice area that once seemed sleepy is wide awake. At the same time, new complexities and challenges have shown themselves.

As part of the move toward ubiquitous alerting, an organization is working to turn online advertisements into emergency alerts. Members of the Federation for Internet Alerts (FIA) are replacing “interest-based advertising” with targeted alerts. Interest-based ads are the ones you see online that know what you’ve been looking for by using Web cookies or mobile service identifiers left behind when you conduct a search or visit a site. Through the FIA plan, interest-based ads would be replaced with emergency alerts for a specific geographic area. The FIA’s Jason Bier, chief privacy officer at the company Conversant, said that through a pilot, Amber Alert messages have been exposed via 500 million “impressions” to more than 100 million devices.



Pricing data backup and disaster recovery (BDR) and business continuity services can be challenging, especially for managed service providers (MSPs) that offer cloud-based storage of customer data.

A time-based cloud retention (TBCR) fixed-pricing model, however, ensures the monthly cost for cloud-based storage of customer data does not vary based on volume.

Also, service providers can use TBCR to offer customers secure, rapidly recoverable off-site backup for a fixed monthly price that is based on how long they need to retain their data.
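As a rough illustration of how retention-based pricing differs from volume-based pricing, here is a hypothetical sketch. The tiers and rates below are invented for illustration, not any vendor's actual price list:

```python
# Hypothetical monthly price tiers keyed by retention period (days),
# not by the volume of data stored -- the premise of TBCR pricing.
TBCR_TIERS = {
    30: 99.00,    # retain backups for 30 days
    90: 149.00,   # retain for one quarter
    365: 249.00,  # retain for a full year
}

def tbcr_monthly_price(retention_days: int) -> float:
    """Return the fixed monthly price for the smallest tier that
    covers the requested retention period."""
    for days in sorted(TBCR_TIERS):
        if retention_days <= days:
            return TBCR_TIERS[days]
    raise ValueError(f"No tier covers {retention_days} days of retention")

# The price depends only on retention time: under this model, storing
# 50 GB or 5 TB for 90 days costs the customer the same each month.
print(tbcr_monthly_price(60))
```

The appeal for MSPs is predictability: the customer's bill never jumps because their data grew, and the provider prices the tiers to absorb average storage growth over the retention window.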



Integration isn’t exactly a fast-moving part of IT, so it isn’t usually listed on New Year technology prediction lists. This year, I spied two integration trends among these lists that could potentially shake up IT and the business.

First, CIO.com lists deeper ERP integration as a top trend for enterprise software in 2015. This could be huge for business users, who could then leverage that rich ERP data for other applications — especially CRM. Jeremy Roche, CEO of cloud ERP provider FinancialForce, explained it thusly:



Friday, 09 January 2015 00:00

Abiding by the rules of business continuity


There are many 'rules' that govern what we do as business continuity professionals – some are sector specific, some are based on geography. But which of them apply to your organization? When you start to look into it, it's not difficult to become confused as to which you are supposed to abide by.

The Business Continuity Institute now aims to simplify this by publishing what we believe to be the most comprehensive list of legislation, regulations, standards and guidelines in the field of business continuity management. This list was put together based upon information provided by the members of the Institute from all across the world. Some of the items may only be indirectly related to BCM, and should not be interpreted as specifically designed for the industry, but rather they contain sections that could be useful to a BCM practitioner.

The ‘BCM Legislations, Regulations, Standards and Good Practice’ document breaks the list down by country and for each entry provides a brief summary of what the regulation entails, which industries it applies to, what the legal status of it is, who has authority for it and, finally, it provides a link to the full document itself.

The BCI has done its best to check the validity of these details but takes no responsibility for their accuracy and currency at any particular time or in any particular circumstances. To download a copy of the document, click here.

Friday, 09 January 2015 00:00

IBM Stays the Storage Course

The overall storage market has seen a number of challenges recently in achieving desired goals, such as in the number of petabytes vendors actually sell. That has led a few prognosticators to offer a “sky-is-falling” analysis of the situation (as that attracts attention). But that approach is fundamentally wrong.

Now, in any dynamic and rapidly changing market such as storage where trends, such as software-defined solutions and Flash technologies are transforming vendor and customer expectations, and where global IT trends, like cloud, big data, and mobile also have an immense impact, there are likely to be challenges. That is especially the case where both established vendors and newer players duke it out.

The key is not to panic. And that is why it is so important to IBM’s storage customers that the company is staying the course. This does not mean standing still, but rather progressing in a measured manner. IBM’s recent 4th quarter storage announcements do not contain any blockbusters. For that we can be grateful as blockbusters absorb all the attention and we have to expend a lot of thought, time and energy in trying to understand what impact the blockbuster will have.



Friday, 09 January 2015 00:00

BYOD: Follow the Money

The topic of money – who pays for what and how to get the best plans when business and consumer activities are mixed – has been a vexing one since Bring Your Own Device (BYOD) emerged. It has taken something of a back seat while companies figured out how to keep data secure and separate in the two spheres.

Those primary tasks are well on their way to being solved. Now attention is turning, as it eventually always does, back to the money. The industry is getting serious about the issue, at least at the rudimentary level of splitting work and consumer bills.

Mobile Enterprise reports that the AT&T Work Platform will enable organizations to separate work and consumer expenditures. The story says that it is an important task from several points of view. Of course, there is the simple point of figuring out who pays for what. Beyond that are the legal, human resources and tax regulations. AT&T is cooperating with big-name vendors MobileIron, AirWatch by VMware and Good Technology on the platform.



Friday, 09 January 2015 00:00

Why CIOs Will Want Data Lakes

Edd Dumbill may have just won the argument over whether data lakes are a practical, achievable idea.

Data lakes are a simple enough idea: You dump a wide range of data into a Hadoop cluster and then leverage that across the enterprise.

The problem is what Gartner calls the “Data Lake Fallacy,” which is the challenge of managing data lakes in a governable and secure way.

Dumbill acknowledges the barriers to data lake adoption in a recent O’Reilly Radar Podcast. Ultimately, though, the VP of strategy at Silicon Valley Data Science says data lakes will happen for one reason: Data lakes free data from enterprise silos.

“One of the hardest things for organizations to get their head around is getting data in the first place,” Dumbill told O’Reilly’s Mac Slocum. “A lot of CIOs will be, ‘Great, I want to do data science but I’ve got this database over here and this one over here and these all need to speak to each other and they’re in different formats and so on.’ In many ways, having data in a data lake provides you with a foundation (with) which you can start to integrate data with and then make it accessible as a building block in an organization.”
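Dumbill's "building block" point can be made concrete with a toy stand-in for the data lake ingest step: disparate sources in different formats are landed in one place in a common record shape, tagged with their origin. The source names and fields below are invented for illustration; at real scale this co-location is what Hadoop clusters do across enterprise silos:

```python
import csv
import io
import json

# Two hypothetical silos in different formats: a CRM export (CSV)
# and an application event log (JSON).
crm_csv = "customer,city\nAcme,Austin\nGlobex,Denver\n"
events_json = '[{"customer": "Acme", "action": "login"}]'

def ingest(lake, source_name, records):
    """Land records in the lake, tagging each with its provenance."""
    for rec in records:
        rec["_source"] = source_name  # provenance travels with the data
        lake.append(rec)

lake = []
ingest(lake, "crm", list(csv.DictReader(io.StringIO(crm_csv))))
ingest(lake, "events", json.loads(events_json))

# Once co-located, data from different silos can be queried together
# without format-by-format plumbing.
austin_customers = [r["customer"] for r in lake if r.get("city") == "Austin"]
print(len(lake), austin_customers)
```

The point of the sketch is the one Dumbill makes: the hard part is getting the data out of its silos and into one accessible place at all; schema-on-read analysis comes after.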



“Pandemic” and “panic” sound a lot alike. Certainly, the first can trigger the second in next to no time, as the recent outbreak of Ebola has demonstrated. But as a leader in your company, you can avoid both by encouraging your cross-functional teams to take the following six steps.



(TNS) — There are chilling similarities between the deadly Charlie Hebdo attack in Paris and the Boston Marathon bombings, with lessons to be drawn for law enforcement, terrorism experts say.

Both attacks have been blamed on homegrown terrorist brothers — in each case with a brother who had drawn law enforcement attention for Islamic radical ties before. In both cases, both police and citizens were targeted with equal cold-blooded vigor.

“I think what you’re going to see is governments going through their watch lists to see how many names appear identical. They should have added worry when you have two or three members of the same family giving prior warning, governments should be taking a second and third look at them,” said Victor Davis Hanson of the Hoover Institution. “When you are dealing with familial relations, it means there are fewer people who have privileged information about the ongoing plotting and the secret is reinforced by family ties ... it’s going to be much harder for Western intelligence to break into them.”



Thursday, 08 January 2015 00:00

43 States Have 'Widespread' Flu Problems

(TNS) -- Influenza viruses have infiltrated most parts of the United States, with 43 states experiencing "widespread" flu activity and six others reporting "regional" flu activity, according to the Centers for Disease Control and Prevention.

Hawaii was the only state where flu cases were merely "sporadic" during the week that ended Dec. 27, the CDC said in its latest FluView report. One week earlier, California also had been in the "sporadic" category, and Alaska and Oregon reported "local" flu outbreaks. Now all three states have been upgraded to "regional" flu activity, along with Arizona, Maine and Nevada.

The rest of the states are dealing with "widespread" outbreaks, according to the CDC.



Thursday, 08 January 2015 00:00

SMBs Should Consider These Tech Trends in 2015

Of course, the end of 2014 and the beginning of 2015 bring all sorts of articles predicting what will be hot in the coming year. For small to midsize businesses (SMBs), quite a few outlets are reporting their lists of technology trends to watch.

Entrepreneur gave three “promising trends” for 2015, which include creating and leveraging well-designed technology, adopting software as a service (SaaS) and developing “data-driven insights.”

Taking advantage of data to make better informed decisions is also a top trend for SMBs to watch from the Huffington Post. According to writer Joyce Maroney, “Smaller businesses, swimming in lots of data of their own, will likewise be taking more advantage of that data to bring science as well as art to their decision making.” That likely means delving further into more data sources than just Google Analytics, says Entrepreneur writer Himanshu Sareen, CEO of Icreon Tech.



The presence or lack of catastrophes is a defining event when it comes to the financial state of the U.S. property/casualty insurance industry.

At the 2014 Natural Catastrophe Year in Review webinar hosted by Munich Re and the Insurance Information Institute (I.I.I.), we can see just how defining the influence of catastrophes can be.

U.S. property/casualty insurers had their second best year in 2014 since the financial crisis – 2013 was the best – according to estimates presented by I.I.I. president Dr. Robert Hartwig.

P/C industry net income after taxes (profits) is estimated at around $50 billion for 2014, after net income rose by 82 percent to $63.8 billion in 2013 on lower catastrophe losses and capital gains.



Thursday, 08 January 2015 00:00

Survey: business continuity in 2015

Continuity Central’s annual survey asking business continuity professionals about their expectations for the year ahead is now live.

Please take part at https://www.surveymonkey.com/r/businesscontinuityin2015

The survey looks at the trends and changes the profession can expect to see in the year ahead.

Read the results from previous years:

Thursday, 08 January 2015 00:00

Scoping Out Your Program/Risk Assessment

At the PLI Advanced Compliance & Ethics Workshop in NYC in October, Scott Killingsworth of the Bryan Cave law firm noted that each risk assessment should be unique. I agree, and I believe the case for uniqueness is even more powerful for the combined program and risk assessments companies sometimes undertake. Given the diversity of possibilities, where should you start in scoping out such an engagement? Another way of asking this question is “How should you conduct a needs assessment for a program/risk assessment?”

To begin, it may be worth thinking in terms of the following six fields of information which can comprise the subjects of an assessment:



The future of IT infrastructure is changing. My friend, BJ Farmer over at CITOC, is fond of reminding me that Change is the Only Constant (see what CITOC stands for?).

It’s true for most everything in life, and especially true for our industry. You can either embrace the changes that come along, evolving how you present services to your clients, or you can slowly lose relevance and fade out of the big picture. The choice is yours.

Right now, change comes from The Cloud.

Yes, there is definitely a lot of hype about the cloud, and it’s easy to grumble about fads and look at the big cloud migration as a bandwagon everyone’s too eager to jump on. But the plain fact is that the cloud is providing affordable, smart alternatives to the kind of infrastructure that used to be the bread and butter of an MSP, and it’s not going anywhere. So you can either keep railing against the cloud, running your Exchange servers and piecing together various services from different partners, or you can start thinking about how to offer innovative solutions for your clients by STRATEGICALLY leveraging the cloud.



Thursday, 08 January 2015 00:00

Human Error Caused 93% of Data Breaches

Despite tremendously increased attention, the number of reported cyberbreach incidents escalated rapidly in 2014. According to Information Commissioner’s Office data collected by Egress Software Technologies, U.K. businesses saw substantially more breaches last year, with industry-wide increases of 101% in healthcare, 200% in insurance, 44% among financial advisers, 200% among lenders, 56% in education and 143% in general business. As a result, these industries also saw notable increases in fines for data protection violations.

The role of employees was equally alarming. “Only 7% of breaches for the period occurred as a result of technical failings,” Egress reported. “The remaining 93% were down to human error, poor processes and systems in place, and lack of care when handling data.”

Check out more of the findings from Egress’ review in the infographic below:



The recent Ebola outbreak unearthed an interesting phenomenon. A “mystery hemorrhagic fever” was identified by HealthMap — software that mines government websites, social networks and local news reports to map potential disease outbreaks — a full nine days before the World Health Organization declared the Ebola epidemic. This raised the question: What potential do the vast amounts of data shared through social media hold in identifying outbreaks and controlling disease?

Ming-Hsiang Tsou, a professor at San Diego State University and an author of a recent study titled The Complex Relationship of Realspace Events and Messages in Cyberspace: Case Study of Influenza and Pertussis Using Tweets, believes algorithms that map social media posts and mobile phone data hold enormous potential for helping researchers track epidemics.

“Traditional methods of collecting patient data, reporting to health officials and compiling reports are costly and time consuming,” Tsou said. “In recent years, syndromic surveillance tools have expanded and researchers are able to exploit the vast amount of data available in real time on the Internet at minimal cost.”
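The basic idea behind these syndromic surveillance tools can be sketched very simply: count posts mentioning symptom terms per day and flag days where the volume spikes above a baseline. The term list, data, and threshold below are invented for illustration; HealthMap's real pipeline involves far more (natural-language processing, geolocation, source weighting):

```python
from collections import Counter

# Hypothetical symptom keyword list for a flu-like illness.
SYMPTOM_TERMS = {"fever", "cough", "chills", "aches"}

def daily_signal(posts):
    """posts: iterable of (date, text). Count posts per day that
    mention at least one symptom term."""
    counts = Counter()
    for date, text in posts:
        if set(text.lower().split()) & SYMPTOM_TERMS:
            counts[date] += 1
    return counts

def flag_spikes(counts, threshold=1.2):
    """Flag days whose count exceeds threshold x the mean daily count.
    A real system would use a proper baseline model, not a flat mean."""
    if not counts:
        return []
    mean = sum(counts.values()) / len(counts)
    return [d for d, c in sorted(counts.items()) if c > threshold * mean]

posts = [
    ("01-05", "great game last night"),
    ("01-06", "woke up with a fever and chills"),
    ("01-07", "fever again"),
    ("01-07", "bad cough today"),
    ("01-07", "whole family has aches"),
]
print(flag_spikes(daily_signal(posts)))
```

Even this crude version illustrates Tsou's point: the raw signal is free and arrives in real time, whereas traditional patient reporting chains take days to compile.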



(TNS) — After a series of 13 small earthquakes rattled North Texas from Jan. 1 to Wednesday, a team of scientists is adding 22 seismographs to the Irving area in an effort to learn more.

The team of seismologists from Southern Methodist University, which has studied other quakes in the area since 2008, deployed 15 of the earthquake monitors Wednesday. SMU studies of quakes in the DFW Airport and Cleburne areas have concluded wastewater injection wells created by the natural gas industry after fracking are a plausible reason for the temblors in those areas.

But Craig Pearson, seismologist for the state Railroad Commission, said that is not the case with the Irving quakes.

“There are no oil and gas disposal wells in Dallas County,” Pearson said in a Wednesday email.



Wednesday, 07 January 2015 00:00

Frigid Weather Heightens Ice Hazards

Freezing weather now sweeping across much of the U.S. brings a greater risk of ice storms and underlines the need for careful planning and heightened safety measures.

In fact, it does not take much ice to create disaster conditions. Even a thin coat of ice can create dangerous conditions on roads. Add strong winds and you have a recipe for downed trees and power lines, bringing outages that can last for days.



Wednesday, 07 January 2015 00:00

How We Get Work Done: Good Old Email

While attention is focused this week on the CES 2015 show in Las Vegas and all the new technology, gadgets and apps that may change the way we work in the near future, Pew Research has a reminder of the technology that we truly consider indispensable at work: Email and the Internet.

After a survey of 1,066 adult Internet users, Pew Research analyzed results from those who have full- or part-time jobs. When it comes to the digital work lives of these respondents, the findings indicate, the tools designated as “very important” are nothing new. Sixty-one percent named email, 54 percent “the Internet,” and 35 percent a landline phone. Cell phones and smartphones trailed at 24 percent, and social networking sites grabbed a measly 4 percent.

Pew notes that email is still king despite increasing awareness of drawbacks, including “phishing, hacking and spam, and dire warnings about lost productivity and email overuse.” In fact, 46 percent of respondents said they think they are more productive with their use of email and other digital tools; 7 percent say they are less productive. Being more productive, these workers report, includes communicating with more contacts outside the company, more flexible work hours, and more hours worked.



By David Honour

As we enter a new year it’s always a good exercise to look ahead at potential changes in the coming 12 months and what these might mean for existing business continuity plans and systems. Will the strategies you had in place in 2014 remain fit for purpose, or will some reworking be necessary? What emerging threats need to be considered to ensure that new exposures are not developing? In this article I highlight three areas which are likely to be the biggest generic business continuity challenges in 2015.

The rise and rise of information security threats

2014 was the year that information security related incidents took many of the business continuity headlines, with attacks increasing in sophistication, magnitude and impact. This situation is only going to get worse during 2015.

The greatest risk is that of a full-on cyber war breaking out, which would inevitably result in collateral damage to businesses. The first salvoes have been seen in a potential United States versus North Korea cyber war, but other state actors are also well geared up for cyber battle, including Israel, Russia, China and India. The cyber-warfare skills of terrorist groups such as ISIS should also not be underestimated.



On January 1, 2015, version 3.0 of the PCI (Payment Card Industry) Data Security Standards replaced version 2.0 as the standard. In other words, what some financial institutions, merchants, and other credit card payments industry members already saw as an onerous process—complying with PCI standards and possibly being audited—is about to get even harder. While I can’t take the blood, sweat and tears out of PCI compliance, as an experienced Qualified Security Assessor (QSA) I can give you some context for why PCI is issuing a new version of its standards, and why 3.0 is a good thing for your business in the end.



Industrial-organizational (I-O) psychologists are all about what makes us tick in the workplace, so it’s unsurprising that the Society for Industrial and Organizational Psychology (SIOP) releases an annual “Top 10 Workplace Trends” list. Equally unsurprising, but interesting nonetheless, is that the list for 2015 is highly tech-focused.

Judging from the list, which was compiled on the basis of a survey of SIOP’s 8,000 industrial-organizational psychologists, these folks appear to have a pretty good handle on technology trends, which clearly have had a significant impact on their views of the workplace in the coming year. Here’s their Top 10 list:



(TNS) -- Hydraulic fracturing at two well pads in Mahoning County caused 77 small earthquakes last March along a previously unknown geologic fault, a new scientific study says.

The series of temblors included one quake of magnitude 3 -- rare in Ohio -- that was strong enough to be felt by neighbors, according to the study by three researchers from Miami University.

At the time of the quakes, only five were reported, ranging from magnitude 2.1 to 3.

The new research was published online Tuesday in the Bulletin of the Seismological Society of America. It will be printed in the February-March issue of the bulletin.

The peer-reviewed study of the quakes, which occurred in Poland Township southeast of Youngstown, appears to strengthen the link between small- and medium-sized earthquakes and both hydraulic fracturing (also known as fracking) and the use of injection wells for drilling wastes.



Enterprise organizations are looking to partner with MSPs as they move to the cloud. The key to success is to develop an engagement plan using a high-touch process to ensure a smooth experience during all three phases of the onboarding process: the assessment, the transition plan and cutover, and ongoing performance analysis. Like most new technologies, cloud computing can require significant changes in business processes, application architectures, technology infrastructure, and operating models that must be properly understood before embarking on any new initiative. A well-thought-out strategy can mean the difference between success and failure.



(TNS) — The humble infusion pump: It stands sentinel in the hospital room, injecting patients with measured doses of drugs and writing information to their electronic medical records.

But what if hackers and identity thieves could hijack a pump on a hospital’s information network and use it to eavesdrop on sensitive data like patient identity and billing data for the entire hospital?

It is not a far-fetched scenario. Though it hasn’t happened yet, the hacking of wireless infusion pumps is considered a critical cybersecurity vulnerability in hospitals — so much so that federal authorities are focusing on the pumps as part of a wide-ranging effort to develop guidelines to prevent cyberattacks against medical devices.



(TNS) — When Glynn County, Ga., Police Chief Matt Doering began his career nearly three decades ago, the thought of holding an interactive map in his hand would have been like something out of a science fiction novel.

He and the rest of the Glynn County public safety community will see fiction become reality when the county’s new $485,000 computer aided dispatch, or CAD, system goes online next Monday. The county spent an additional $1.1 million to convert decades’ worth of reports and other information kept in a separate records management system that works with the new software.

“We wouldn’t have dreamed of this,” Doering said. “It is going to be a new mindset.”

His excitement is shared by others because it has been 12 years since the system that helps disseminate information about emergency calls has been updated. In technological terms, that is like a century.



New data from IBM (IBM) showed that despite a decline in cyber attack incidents against U.S. retailers, the number of customer records stolen during cyber attacks remained near record highs in 2014.

IBM reported that cyber attackers compromised more than 61 million retail customer records in 2014, down from almost 73 million in 2013.

When IBM narrowed its data to incidents involving fewer than 10 million customer records (which excludes the two largest attacks over this timeframe, against Target Corporation and The Home Depot), the number of records compromised last year increased by more than 43 percent over 2013. IBM said that cyber criminals have become more sophisticated in reaching customer records.
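Taken at face value, IBM's headline totals imply a year-over-year decline of roughly 16 percent in compromised records. A quick sketch, using the article's rounded figures:

```python
# Rough year-over-year change in compromised retail customer records,
# using the article's rounded totals ("almost 73 million" and
# "more than 61 million").
records_2013 = 73_000_000
records_2014 = 61_000_000

decline = (records_2013 - records_2014) / records_2013
print(f"{decline:.0%}")  # prints "16%"
```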



Traditionally, insurance agencies do not reward companies that stay out of trouble. The idea is to split the cost of compensation to a few unfortunate enterprises among the larger number of all enterprises that take out an insurance policy. Compensation is paid according to the nature of the insurance claim presented and the terms of the policy. However, it can only be made if risks can be evaluated and damage calculated. Some aspects such as damage to a company’s brand may be impossible to assess, even if they have a major negative impact. Insureds and insurers try to work with quantifiable factors. But smart enterprises know there is additional leverage to be gained when putting insurance in place.



High-profile data breaches at well-known companies such as Home Depot, Staples and Sony have shined a bright spotlight on data security, or the lack of it. But these breaches have also raised an alarm within these public companies and other organizations. Many more companies, including big IT service providers, have elevated the job of IT security to the C-level, a highly visible response to what is now a highly visible issue.

“Security jobs are being moved to the C-suite because the billions lost to data breaches are a C-level problem,” said Arthur Zilberman, CEO, LaptopMD.com, a New York-based computer repair company.



(TNS) — An ice storm 10 years ago proved a learning experience for some local agencies, and proof of proper preparedness for others.

The ice storm of 2005 left more than 75,000 residents without power for several days, killed four people and devastated the city and county.

Looking back, Russ Decker, director of the Allen County Emergency Management Agency, said what stood out about the storm was how Allen County responded afterward.

“When it was over, the first thing everybody wanted to do was get together and figure out what we can do” better next time, he said.

The results left more municipalities and county agencies ready in case there is a repeat of 2005’s disaster.



After deciding to focus its efforts squarely on the mainframe at the end of 2014, Compuware is starting 2015 off with the launch today of Topaz, a data virtualization framework that makes mainframe data more accessible.

Compuware CEO Chris O’Malley says that with the vast amounts of enterprise data that reside on the mainframe, one of the core challenges organizations face is finding ways to make that information accessible to the entire organization. Topaz, says O’Malley, provides a layer of abstraction that makes that data accessible without having to intimately understand how, for example, a COBOL application was constructed.

O’Malley says Topaz will enable IT organizations that still depend on mainframes to run their most mission-critical applications to introduce more flexibility by not only making that data available via a single user interface, but also enabling users to copy that data using a simple drag-and-drop file transfer utility.



Increased supercomputing capacity will improve accuracy of weather forecasts


Today, NOAA announced the next phase in the agency’s efforts to increase supercomputing capacity to provide more timely, accurate, reliable, and detailed forecasts. By October 2015, the capacity of each of NOAA’s two operational supercomputers will jump to 2.5 petaflops, for a total of 5 petaflops – a nearly tenfold increase from the current capacity.

“NOAA is America’s environmental intelligence agency; we provide the information, data, and services communities need to become resilient to significant and severe weather, water, and climate events,” said Kathryn Sullivan, Ph.D., NOAA’s Administrator. “These supercomputing upgrades will significantly improve our ability to translate data into actionable information, which in turn will lead to more timely, accurate, and reliable forecasts.”

Ahead of this upgrade, each of the two operational supercomputers will first more than triple their current capacity later this month (to at least 0.776 petaflops for a total capacity of 1.552 petaflops). With this larger capacity, NOAA’s National Weather Service in January will begin running an upgraded version of the Global Forecast System (GFS) with greater resolution that extends further out in time – the new GFS will increase resolution from 27km to 13km out to 10 days and 55km to 33km for 11 to 16 days. In addition, the Global Ensemble Forecast System (GEFS) will be upgraded by increasing the number of vertical levels from 42 to 64 and increasing the horizontal resolution from 55km to 27km out to eight days and 70km to 33km from days nine to 16.
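As a rough illustration of what those resolution numbers mean computationally: halving the horizontal grid spacing roughly quadruples the number of grid points, since point count scales with the inverse square of the spacing. A back-of-envelope sketch, assuming a uniform grid (the operational model grids are more complex):

```python
def gridpoint_factor(old_km: float, new_km: float) -> float:
    """Approximate multiplier on horizontal grid points when
    spacing tightens from old_km to new_km (inverse-square scaling)."""
    return (old_km / new_km) ** 2

print(round(gridpoint_factor(27, 13), 2))  # GFS days 1-10: prints 4.31
print(round(gridpoint_factor(55, 33), 2))  # GFS days 11-16: prints 2.78
```

So the days 1-10 upgrade alone implies more than four times as many horizontal grid points to compute, before counting the added vertical levels.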

Computing capacity upgrades scheduled for this month and later this year are part of ongoing computing and modeling upgrades that began in July 2013. NOAA’s National Weather Service has upgraded existing models – such as the Hurricane Weather Research and Forecasting model, which did exceptionally well this hurricane season, including for Hurricane Arthur which struck North Carolina. And NOAA’s National Weather Service has operationalized the widely acclaimed High-Resolution Rapid Refresh model, which delivers 15-hour numerical forecasts every hour of the day.

“We continue to make significant, critical investments in our supercomputers and observational platforms,” said Louis Uccellini, Ph.D., director, NOAA’s National Weather Service. “By increasing our overall capacity, we’ll be able to process quadrillions of calculations per second that all feed into our forecasts and predictions. This boost in processing power is essential as we work to improve our numerical prediction models for more accurate and consistent forecasts required to build a Weather Ready Nation.”

The increase in supercomputing capacity comes via a $44.5 million investment using NOAA's operational high performance computing contract with IBM, $25 million of which was provided through the Disaster Relief Appropriations Act of 2013 related to the consequences of Hurricane Sandy. Cray Inc., headquartered in Seattle, plans to serve as a subcontractor for IBM to provide the new systems to NOAA.

“We are excited to provide NOAA’s National Weather Service with advanced supercomputing capabilities for running operational weather forecasts with greater detail and precision,” said Peter Ungaro, president and CEO of Cray. “This investment to increase their supercomputing capacity will allow the National Weather Service to both augment current capabilities and run more advanced models. We are honored these forecasts will be prepared using Cray supercomputers.”

"As a valued provider to NOAA since 2000, IBM is proud to continue helping NOAA achieve its vital mission," said Anne Altman, General Manager, IBM Federal. "These capabilities enable NOAA experts and researchers to make forecasts that help inform and protect citizens. We are pleased to partner in NOAA's ongoing transformation."

NOAA's mission is to understand and predict changes in the Earth's environment, from the depths of the ocean to the surface of the sun, and to conserve and manage our coastal and marine resources.

Tuesday, 06 January 2015 00:00

Winter Weather and Cat Losses

With frigid temperatures and snow expected to fall around the New York City area and other parts of the United States this week, it’s a good time to review how winter storms can impact catastrophe losses.

For insurers, winter storms are historically very expensive and the third-largest cause of catastrophe losses, behind only hurricanes and tornadoes, according to the I.I.I.

Despite below-average catastrophe losses overall in 2014, insured losses from winter storms were significant. In fact, winter storms in the U.S. and Japan accounted for two of the most costly insured catastrophe losses in 2014.

According to preliminary estimates from sigma, extreme winter storms in the U.S. at the beginning of 2014 caused insured losses of $1.7 billion, above the average full-year winter storm loss number of $1.1 billion of the previous 10 years.



When it comes to mobile computing, MSPs should be gearing up for a lot more complexity going into 2015. For all practical purposes, usage of mobile computing devices has been fairly limited to accessing email and browsing the web. But by the end of this year, most employees will probably have as many as five to ten applications developed by the companies they work for running on their devices. For MSPs, that means developing the capability to manage mobile applications, not just the devices they run on, will be a critical requirement in 2015.

According to Phil Redman, vice president of mobile solutions and strategy for Citrix, mobile applications almost by definition will be accessing a mix of backend services running on premises and in the cloud. As such, IT organizations will be looking to work with MSPs that not only have application management expertise, but also familiarity with the entire scope of their enterprise IT operations.



I’m back at my desk after a relaxing holiday vacation. It was a pretty quiet time for cybersecurity, too. The only really disturbing news I saw during my holiday involved a data breach at Chick-fil-A and the new theory that the Sony breach likely wasn’t done by North Korea but by an insider (but then again, some of us were questioning insider involvement from the beginning).

You and I know too well that this little lull in cybersecurity news won’t last very long, but I do think that this is a good time for companies to review their cybersecurity procedures and policies. We saw the damage from the fallout after the Sony incident and I think Target is still picking up the pieces from its breach a year ago.

Near the end of 2014, Ponemon released a study, “2014 Cost of Cyber Crime Study: United States,” that shows just how expensive and damaging a breach can be: It revealed that it can cost upwards of $20,000 a day for incidents that may take, on average, a month to fix. Jon Oberheide of Duo Security pointed out that SMBs need to be especially concerned about these breach costs, telling me in an email:

While the mega-breach-du-jour gets the most media attention, Ponemon's study calls out an important distinction: The impact of breaches is much greater on small and medium businesses than the large enterprises. The real challenge in cybersecurity is how to protect the millions of businesses who don't have an enormous security budget or a large roster of top security talent to defend their organization. And yet, they face the same attacks and adversaries as the big guys. So while companies like Sony face dramatic consequences in the short-term, they will rebuild, recover, and revisit their security strategy to continue their operations in the long-term. But if you're not a Sony-scale company ...you may just have your business effectively wiped out.
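The Ponemon figures quoted above translate into a sobering order-of-magnitude total. Illustrative arithmetic only, multiplying out the article's round numbers; the study itself breaks costs down far more finely:

```python
# Illustrative only: the article's round figures, multiplied out.
daily_cost = 20_000        # "upwards of $20,000 a day"
avg_days_to_fix = 30       # "on average, a month to fix"

total = daily_cost * avg_days_to_fix
print(f"${total:,}")  # prints "$600,000"
```

For an SMB without a large security budget, a single incident on that scale can be existential, which is exactly Oberheide's point.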



Tuesday, 06 January 2015 00:00

Are We Closing in on the Quantum Enterprise?

The prevailing narrative in enterprise circles these days is that things will keep getting bigger: Big Data, regional data centers, hyperscale … everything is aimed at finding the magic formula that allows organizations to deal with larger workloads at less cost.

It is ironic, then, that one of the ways researchers are hoping to tackle this problem is by shrinking the basic computing elements – processing, storage and networking – to atomic and even sub-atomic levels in order to derive greater power and efficiency from available resources.

So-called quantum computing (QC) has been a facet of high-performance architectures for some time, but lately there has been steadily increasing buzz about enterprise applications as well.



People who manage a functional department or a business process may find it tough to set recovery objectives for what they manage so devotedly, day in and day out. That does not necessarily mean that they are not objective. Instead, they may not know how critical their part of the business is to the rest of the organisation. Without a measuring stick, they cannot confidently make recommendations or requests about suitable recovery times. So when the next business continuity planning moment comes along, BC managers may find that they have some handholding and educating to do to bring different organisational units up to speed.



Tuesday, 06 January 2015 00:00

From the Extreme to the Mean

By 2050, most of the US coast can expect to see 30 or more days a year of floods up to two feet above high tide levels, says a new NOAA study.

The study, ‘From the Extreme to the Mean: Acceleration and Tipping Points for Coastal Inundation due to Sea Level Rise’, has been published in the American Geophysical Union’s online peer-reviewed journal Earth’s Future.

NOAA scientists Sweet and Joseph Park established a frequency-based benchmark for ‘tipping points’: when so-called nuisance flooding, defined by NOAA’s National Weather Service as between one and two feet above local high tide, occurs 30 or more times a year.

Based on that standard, the NOAA team found that these tipping points will be met or exceeded by 2050 at most of the US coastal areas studied, regardless of the amount of sea level rise likely to occur this century. In their study, Sweet and Park used a set of recent projections of 1.5 to 4 feet of global sea level rise by the year 2100, similar to the rise projections of the Intergovernmental Panel on Climate Change, but also accounting for local factors such as the settlement of land, known as subsidence.



The BCI has published an updated version of its guide to business continuity legislation, regulation, standards and guidance around the world.

Although not completely comprehensive, the guide is probably the best currently available.

The guide starts by listing current and projected international initiatives, particularly those supported by the International Organization for Standardization (ISO), the European Union (EU) and the Basel Committee on Banking Supervision.

Each entry is categorized into one of four headings:

Legislation: government laws which include aspects of business continuity management by name or are sufficiently similar in nature (disaster recovery, emergency response, crisis management) to be treated as BCM legislation. To be included in this category, they must be legally enforceable legislation passed by a national, federal, state or provincial government.

Regulations: Mandatory rules or audited guidance documents from official regulatory bodies.

Standards: Official standards from national (and international) accredited standards bodies which relate to business continuity as a whole or to a specific related subset such as IT service continuity.

Good practice: Guidelines published as good (or best) practice by various authoritative bodies.

Obtain the document.

Policy uncertainty at home and economic and geopolitical risks overseas are the central challenges facing chief financial officers (CFOs) of the UK’s largest companies as they enter 2015, according to a survey by Deloitte.

Deloitte’s latest CFO Survey gauged the views of 119 CFOs of FTSE 350 and other large private UK companies. It found that risk appetite among CFOs fell in Q4 2014. Fifty-six percent of CFOs say that now is a good time to take greater risk onto their balance sheets, down from a record reading of 71 percent in Q3 2014 but still well above the long-term average. The change was driven by concerns over political and economic uncertainties: when asked to rate the level of risk posed between 0 and 100, CFOs attached a rating of 63 to the UK General Election and 56 to deflation and weakness in the Euro area and to a possible referendum on the UK’s membership of the EU. The level of risk posed by each factor has risen in the last three months. Sixty percent of CFOs enter 2015 with above normal, high or very high levels of uncertainty facing their businesses, up from a low of 49 percent in Q2 2014 but at the same level seen 12 months ago.

Ian Stewart, chief economist at Deloitte, said: “The central challenges facing the UK’s largest companies as they enter 2015 are policy uncertainty at home and economic and geopolitical risks overseas. Rising levels of uncertainty have caused a weakening of corporate risk appetite which, nonetheless, remains well above the long-term average.”



According to preliminary estimates, total economic losses from natural catastrophes and man-made disasters were USD 113 billion in 2014, down from USD 135 billion in 2013. Of the total economic losses, insurers covered USD 34 billion in 2014, down 24 percent from USD 45 billion in 2013. This year’s disaster events have claimed around 11,000 lives.

Of the estimated total economic losses of USD 113 billion in 2014, natural catastrophes caused USD 106 billion, down from USD 126 billion in 2013. The outcome is well below the average annual USD 188 billion loss figure of the previous 10 years. The total loss of life of 11,000 from natural catastrophe and man-made disaster events this year is down from the more than 27,000 fatalities in 2013.

Insured losses for 2014 are estimated to be USD 34 billion, of which USD 29 billion were triggered by natural catastrophe events compared with USD 37 billion in 2013. Man-made disasters generated the additional USD 5 billion in insurance losses in 2014.
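The sigma estimates above are internally consistent: subtracting the natural-catastrophe figures from the totals yields the man-made share. A quick check, in USD billions (the man-made economic figure is implied rather than stated):

```python
# Swiss Re sigma preliminary 2014 estimates, USD billions.
total_economic, natural_economic = 113, 106
total_insured, natural_insured = 34, 29

print(total_economic - natural_economic)  # implied man-made economic losses: 7
print(total_insured - natural_insured)    # man-made insured losses: 5 (as stated)
```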



Winter has officially begun, and the next few months could create both opportunities and challenges for many managed service providers (MSPs).

While winter can bring snow, sleet and other inclement weather, MSPs can provide data backup and disaster recovery (BDR) solutions to help businesses safeguard data and ensure that companies can access this information in any conditions.

And with December quickly drawing to a close, and winter weather on the horizon in cities and towns across the country, now is the perfect time to review this month's top BDR lessons for MSPs.



Monday, 05 January 2015 00:00

A New Year to Prepare

It is that time of year again, a time to reflect on another year gone by and prepare for the new year to come. It is time to dust off last year’s resolutions and come up with a new list of things to accomplish in 2015. While researching the latest diet trend and signing up for the newest exercise class or in between swearing off your guilty pleasures, vowing to set your alarm earlier, and promising to be better at staying in touch, do yourself a favor and add these five simple preparedness resolutions to your list.

1.  Make or update your emergency kit.

If you don’t have an emergency preparedness kit in your house and car, it’s time to get one.

Gather water, food, flashlights, batteries, and a first aid kit into a container or bag and store it in an easy-to-access area of your house or car.

If you already have an emergency kit, take time to review what is in it. Does your extra pair of clothes still fit? Do the flashlights need new batteries? Are all your important documents up to date? Having an emergency kit in your home or car will not be of use during an emergency if your kit is out of date or missing adequate supplies.

For more information on what to include in your emergency kit, visit CDC’s webpage: http://emergency.cdc.gov/preparedness/kit/disasters/.

2.  Form a support network (talk to your neighbors).

New Year’s Eve parties are a great time to catch up with friends and family. Why not use this time surrounded by those you love to talk about preparing for an emergency? Talk to your neighbors about forming a support network and make a plan to check on each other after a disaster occurs. Talk to people close to you about any physical limitations or special medical needs you may have during an emergency. During an emergency it is usually the people in closest proximity that are first to offer aid, and while it may not be the typical topic of conversation at your New Year’s Eve bash, it is an important discussion to have.

3.  Prepare your family (older adults, kids and pets).

When making all your plans to prepare, don’t forget your family. Talk to older adults in your life about their emergency preparedness plans, and ask them how you can help. Make sure your kids are involved in your emergency preparedness planning. Help them understand and be part of natural disaster planning with CDC’s Ready Wrigley. Also, don’t forget your pets. Include food and water for your furry friends in your emergency kit, and identify pet friendly evacuation shelters in your area.

4.  Join an alert network (app, weather radio, email updates).

It’s 2015 and even though we may not have flying cars or time machines, we do have some great technology for tracking and alerting us to natural disasters that may be in our area. Rather than downloading the latest video game or dating app, make sure your phone and computer have alert systems set up to notify you when dangerous weather is in your area. Consider setting up push notifications or email alerts that let you know when a natural disaster may be coming.

5.  Weatherize your home and review your insurance.


The New Year is a perfect time to review your insurance plan and evaluate your home. Install or check smoke detectors and carbon-monoxide alarms in your house. Make sure you know where the utility off and on switches are located. During leaks or when evacuating your home, knowing how to turn off your gas, water, and electricity could help prevent damage to your home and protect your health. Also, check your insurance policy and make sure you are covered for possible flooding or structural damage to your home and property.

Taking time to prepare for emergencies and natural disasters now could be the most important thing you do this year.


A recent survey from antivirus software provider ThreatTrack Security showed that 81 percent of IT security professionals said they would "personally guarantee that their company's customer data will be safe in 2015."

The ThreatTrack Security survey, titled "2015 Predictions from the Front Lines," also revealed 94 percent of respondents said they are optimistic that their organization's ability to prevent data breaches will improve next year.



Monday, 05 January 2015 00:00

The cloud in 2015

Steven Harrison predicts how business use of cloud computing will develop and change during the next 12 months:

Hybrid is the equaliser

Whilst cloud computing has become an integral part of IT systems, concerns around vendor lock-in, licensing restrictions and security mean that businesses are still resistant to moving all IT operations into a hosted environment. As a result, the hybrid cloud will become the deployment model of choice for those organizations that want to leverage the elasticity of the cloud in tandem with existing infrastructure. The challenge for organizations adopting a hybrid approach is ensuring that systems can run in parallel and operate as one environment to guarantee performance and uptime.



Monday, 05 January 2015 00:00

Cybersecurity predictions for 2015

Proofpoint looks at how information security threats are likely to evolve during the coming year.

2014 was a year in which information security vaulted into the public eye, driven by a surge in both the number and the visibility of data breaches and compromises. This new attention will bring greater scrutiny in 2015, just as the nature and severity of threats continue to evolve for the worse.

Cyberextortion will be the most rapidly growing new threat family

Beginning with the rapid rise of CryptoLocker in late 2013, the threat from ransomware expanded rapidly in 2014, adding not only other ‘extortion malware’ but also spreading to mobile platforms such as Android. Paying the ransom remains arguably a popular option despite its risks, and the estimated $3 million in ransoms generated by CryptoLocker alone has shown cybercriminals the revenue potential of digital extortion schemes. These attacks are difficult to defend against and costly to recover from, and lead to business disruption that extends far beyond the loss of data.



By Andrew Hiles

Service level agreements (SLAs) and business continuity go hand-in-hand: or they should do!

Whether SLAs are implemented in support of a balanced scorecard to align information and communications technology with business mission achievement, or as a stand-alone initiative, the strategic use of service level agreements can be a perfect solution to the justification of investment in resilience and business continuity: an approach I have been advocating for over ten years.

How does it work?

First, define the business mission.

Take, as an example, a multinational company – call it Klenehost - selling miniature packs of soap, shampoo, hair conditioner and shower gel to the hotel industry. These are packaged in different ways and customized for specific hotel chains.



By Rachel Weingarten

Some brands stay fresh and relevant generation after generation. What makes certain corporate branding strategies timeless while others come and go?

Take Brooks Brothers. No less a person than Abraham Lincoln was one of the brand’s most loyal customers. So how does a nearly 200-year-old company not only stick around, but remain relevant and even cutting-edge?



Trapp Technology has unveiled disaster recovery (DR) services that are designed to deliver physical or virtual data replication, redundant connectivity and high availability of IT infrastructure during downtime recovery.

The Scottsdale, Arizona-based managed service provider (MSP) said its DR services can instantly initiate seamless recovery of applications and data.

"Our clients consistently asked us to assist them with more of their technology needs, disaster recovery being one of the most common requests," DJ Jones, Trapp's vice president of sales and marketing, told MSPmentor. "With the high demand for disaster recovery services, we made sure [these were] a priority."



Business Continuity planning and maintenance cycles often leave little time and few resources for planning how the organization will react if the unexpected actually occurs (their Incident Response).

An analogy can be drawn between healthcare and Business Continuity Management: planning and plan exercise cycles are analogous to maintaining a healthy lifestyle and having regular medical checkups. And when something serious occurs, the medical care system is prepared to react. So should your BCM program be.



By Natalie Burg

“You’ll shoot your eye out!”

Just when you thought that much-loved line couldn’t mean any more or less than it did the last 500 times you heard it, the popular movie A Christmas Story includes some business lessons you may have overlooked.

Business lessons in a holiday movie? You bet your Red Ryder, carbine action, 200-shot, range model air rifle. In this post we revisit good ol’ Cleveland Street to uncover five business lessons that can be learned from the cinematic classic.



Tuesday, 23 December 2014 00:00

Data Center 2015: Where Do We Go from Here?

’Tis the season for year-end wrap-ups and year-ahead predictions, so, as in past years, I will take a look at what some of the key industry players are saying and then offer my own take on what looks real and what looks imaginary.

One of the broadest discussions of late is the future of the data center itself. As virtualization, the cloud and software-defined architectures gain in popularity, it is not hard to imagine a software-defined data center (SDDC) consisting of an end-to-end data environment sitting entirely atop the virtual layer, with nearly all hardware, save the client device, outsourced to a third-party provider.

This is part of what IDC describes as the 3rd Platform of innovation and growth. Accompanied by advances like mobile computing, Big Data analytics and social networking, the 3rd Platform characterizes what the firm says is the “new core of ICT market growth” and is already responsible for about a third of the total IT spend. For 2015, IDC expects raw compute and storage capacity to shift to cloud-based resources optimized for mobile and Big Data applications, and this will lead to the rise of “cloud first” hardware development – particularly consolidated solutions that cater to hyperscale infrastructure.



In 2015, cybercriminals will increasingly be non-state actors who monitor and collect data through extended, targeted attack campaigns, McAfee Labs predicts. In the group’s 2015 Threats Predictions, Intel Security identified internet trust exploits, mobile, internet of things and cyber espionage as the key vulnerabilities on next year’s threat landscape.

“The year 2014 will be remembered as ‘the Year of Shaken Trust,’” said Vincent Weafer, senior vice president of McAfee Labs. “This unprecedented series of events shook industry confidence in long-standing Internet trust models, consumer confidence in organizations’ abilities to protect their data, and organizations’ confidence in their ability to detect and deflect targeted attacks in a timely manner. Restoring trust in 2015 will require stronger industry collaboration, new standards for a new threat landscape, and new security postures that shrink time-to-detection through the superior use of threat data. Ultimately, we need to get to a security model that’s built-in by design, seamlessly integrated into every device at every layer of the compute stack.”

McAfee Labs predicts the top cybersecurity threats in 2015 will be:



Information technology downtime is a costly proposition. Based on industry surveys, it can cost an organization as much as $5,600 a minute, or well over $300,000 per hour in losses, according to IT research firm Gartner. But the costs and complexities of traditional approaches to disaster recovery can be expensive too, especially for smaller jurisdictions. As a result, some cities are leveraging the cloud to provide a cost-effective way to maintain services in the event of a local or regional emergency.
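The arithmetic behind those figures is simple to reproduce. A minimal sketch, using Gartner's widely cited per-minute average; the outage durations are hypothetical examples, not from the source:

```python
# Gartner's widely cited average cost of IT downtime, in USD per minute.
COST_PER_MINUTE = 5_600

def downtime_cost(minutes: float) -> float:
    """Estimated loss, in USD, for an outage of the given length."""
    return minutes * COST_PER_MINUTE

# One hour: 60 * 5,600 = 336,000 -- the "well over $300,000 per hour" cited above.
hourly = downtime_cost(60)

# A hypothetical four-hour regional outage.
regional = downtime_cost(4 * 60)

print(f"1 hour:  ${hourly:,.0f}")
print(f"4 hours: ${regional:,.0f}")
```

Even a short outage at this average rate can quickly exceed the annual cost of a cloud-based recovery service, which is the economic argument the smaller jurisdictions below are acting on.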

Asheville, N.C., historically maintained its data center redundancy through a local disaster recovery center, located just two blocks from the city’s primary data center. But when Asheville CIO Jonathan Feldman came on board, that scenario made him uncomfortable. 

“Anything that can take out City Hall can probably impact something that’s two blocks away as well,” he said. “It was sort of a thorn in my side. But disaster recovery is not the easiest thing to get money for, so we struggled a bit to find a solution.”



They say that information drives business. Actually, it’s electricity. Your data will most likely be useless if you have no power. On the other hand, if you can turn the lights on, you can start working, one way or another. But now in a kind of millennial Mobius loop, information is also increasingly driving power distribution. Smart grids are a case in point. The benefits are in higher power transmission efficiency, reduced costs, better peak load handling and better integration of customer-owned generating systems. The risk is in the network security.



In 2013 Continuity Central conducted a survey to explore quality control methods that are being used within business continuity management systems. This survey has now been repeated to see how the trends in this area have changed.

The 2014 Quality control and measurement of business continuity management systems survey was conducted online using SurveyMonkey and received 142 responses in total. 84.5 percent of respondents were from large organizations (those with more than 250 employees). Respondents came from around the world, with the most coming from the US (34 percent), the UK (25 percent) and Australia (6 percent).

The survey initially asked “Does your organization have clear processes or methods for the quality control of business continuity plans and systems?” 66.9 percent of respondents said that, yes, their organization did have clear processes or methods, while 29.6 percent said that it did not. This was a very similar result to the 2013 survey, where 64.9 percent answered ‘yes’ and 30.2 percent answered ‘no.’



Few things are as integral to the data center as the server. I know, technically it is only one leg of the three-legged data stool, along with storage and networking, but the server is where the actual processing takes place – the brains of the operation, so to speak – so it is understandable that data executives are a little apprehensive about the divergent path that server development is taking.

On the one hand, mainframes still run a fair amount of the enterprise load, despite numerous calls for the technology to be put out to pasture. At the same time, blade and microservers are showing that they are equally adept at handling massive data loads, particularly when it comes to the parallel processing of multiple data streams that characterize modern Web-facing applications.

In between, there is a plethora of high-power, medium-power and low-power solutions, not to mention the rise of modular infrastructure that could make the whole idea of disparate systems obsolete. So it is no wonder the data center executive is having trouble seeing the future.



As the Ebola outbreak in West Africa led many to be concerned about U.S. capability to respond to infectious disease threats, an annual report shows only half of states score well on 10 key public health measures.

Many states scored poorly on measures of communication and coordination responses to threats, vaccination rates and infections from contact with the health care system, according to the report, released annually by the Robert Wood Johnson Foundation and Trust for America’s Health. 

"Over the last decade, we have seen dramatic improvements in state and local capacity to respond to outbreaks and emergencies,” said Jeffrey Levi, executive director of the Trust, in a statement. “But we also saw during the recent Ebola outbreak that some of the most basic infectious disease controls failed when tested.”



In 2015, almost every CIO will be tasked with assessing their organizations and technology to ensure data and confidential information is protected.

Current Situation

Target, Home Depot, Staples, who’s next? These are just the most recent retail outlets that made the news. What is not making the headlines are the multitude of private- and public-sector organizations that have been hacked and lost data and information — many times totally unaware until after the fact.



Monday, 22 December 2014 00:00

Data Center Efficiency: Look Before You Leap

Efficiency in the data center is a big thing now, with organizations of all sizes working to develop both the infrastructure and the practices that can help lower the energy bill. But while analysis of data flows and operating characteristics within equipment racks is fairly advanced, the ability to peek under the covers to see how energy is actually being used is still very new.

To be sure, there is a variety of tools on the market these days, from simple measurement devices to full Data Center Infrastructure Management (DCIM) platforms, but more often than not the question revolves around not only what to measure, but how.

Without adequate insight into what is going on, it is nearly impossible to execute an effective energy management plan, says UK power efficiency expert Dave Wolfenden. Many standard tests, in fact, fail in this regard because they attempt to gauge the upper capabilities of power and cooling equipment, not how to maintain maximum efficiency during normal operation. New techniques like computational fluid dynamics (CFD) can help in this regard, but they must be employed with proper baselines in order to give a realistic indication of actual vs. projected results.



Monday, 22 December 2014 00:00

2015: The Year of Agile Data Warehousing

2015 will be the year that agile data warehouse (DW)/business intelligence (BI) takes off.  Traditional strategies for DW/BI have been challenged at best, with the running joke being that a DW/BI team will build the first release and nobody will come. On average, Agile strategies provide better time to market, improved stakeholder satisfaction, greater levels of quality, and better return on investment (ROI) than do traditional strategies. The DW/BI community has finally started to accept this reality, and it is now starting to shift gears and adopt agile ways of working. My expectation is that 2015 will see a plethora of books, case studies, and blog postings describing people’s experiences in this area.



WatchGuard Technologies is urging organizations to use the nearly epic scale of the Sony cyber attacks to spur themselves into action rather than panic about potential risks.

"A year ago, we predicted major state-sponsored attacks may bring a Hollywood movie hack to life that exploits a flaw against critical infrastructure – we just didn't predict it would happen to Hollywood itself," said WatchGuard's Global Director of Security Strategy, Corey Nachreiner. "It's important that IT pros use this opportunity to upgrade what is often five-year-old technology to defend against five-day-old threats."

"The FBI is right when it says that less than 10 percent of companies could survive an attack like the one on Sony," continued Nachreiner. "And, unfortunately, it's not a question of if, but when for these kinds of attacks."

Nachreiner recommends five immediate actions that organizations can take to make sure they have the best possible chance of preventing attacks, and seven actions to minimize damage if cyber criminals do get in:



The Sony hack, and the subsequent threats to the company and its supply chain, has become the biggest information security story of 2014, in a year of many high-profile incidents. What started out as ‘yet another breach story’ a few weeks ago rapidly developed into a very real business continuity and reputation-threatening incident.

On December 19th the FBI published an update on the Sony cyber attack. The highlights include:



Insurance companies face strict business uptime, data management and data protection requirements, and as a result, these businesses need data backup and disaster recovery (BDR) and business continuity solutions that fulfill these needs.

However, managed service providers (MSPs) can offer data BDR and business continuity solutions with image-based backup to help insurance companies back up files, programs and other important information quickly and easily.



Have you ever thought about all the information your appliances tell you? The world is moving toward presenting instant data about every aspect of life. For example, there is now an electric toothbrush with Bluetooth capabilities that can record your brush strokes and let you chart your dental hygiene activities on a smartphone app. Home sensor products not only tell you if your teenager is trying to sneak out at night, but also how many times someone has been dipping into the cookie jar. And many of us can’t even exercise anymore without a fitness band and apps that record every step, every calorie expended, and every turn in our sleep.

While some of that real-time data is great to have, we’re also reaching a point of TMI … “too much information,” or data overload. How much is too much real-time data? Only you can answer that for your personal data needs, but I do know there is one area where there is never enough real-time data. That is in your company’s disaster recovery plan.

Think about a disaster striking your business. You could have all your subject matter experts in place, but if they can’t access data or if your recovery strategy isn’t complete, nothing will work. The consequences could be nothing short of catastrophic: for the vast majority of companies, once they have to shut down because of server problems or another disaster, they aren’t able to recover in a timely fashion. And let’s face it … a faltering or incomplete recovery can spell death for a business.



It is fascinating to watch a new class of software be born. This doesn’t seem to happen that often anymore, but every once in a while a customer or a vendor discovers a gap in the current offerings and fills that gap with something we have never seen before. I recently ran into an event like this at BMC Engage. BMC has a write-up that subtly points to the impending creation of this new security automation product class. And last week, I spoke to Tony Stevens, who works for the Department of Technology, Management and Budget at the State of Michigan and is helping shepherd the birth of this class. Let’s talk about that this week.



(TNS) — Think the Napa fault stopped moving after producing a 6.0 earthquake in August? Think again.

The fault that caused that Napa quake is forecast to move an additional 2 to 6 inches in the next three years in a hard-hit residential area, a top federal scientist said at a meeting of the American Geophysical Union in San Francisco on Tuesday.

It is the first time scientists have formally forecast the gradual shifting of the ground in a residential area after an earthquake.

“Until the South Napa earthquake happened, we had not clearly foreseen just what a problem that could be,” U.S. Geological Survey geophysicist Ken Hudnut said.



Security pros got the Target breach for Christmas last year. The breach hit the retailer during its busiest time of the year and cost it millions in lost business. For security pros desperate for more budget and business prioritization, you couldn’t have asked for a more perfect present; it’s as if Santa himself came down the chimney and placed a beautifully wrapped gift box topped with a bow right under your own tree. This year it looked as if all we were getting was a lump of coal, but then Sony swooped in to save us like a Grinch realizing the true meaning of Christmas.

The Sony Pictures Entertainment (SPE) breach is still unfolding, but what we know so far is that a hacktivist group calling itself the Guardians of Peace (GoP) attacked Sony in retribution for the production of a movie, “The Interview,” which uses the planned assassination of North Korea’s leader as comedic fodder. The hacktivists supposedly stole 100 TB of data that they are gleefully leaking bit by bit (imagine Jingle Bells as the soundtrack). The attack itself affected the availability of SPE’s IT infrastructure, forcing the company to halt production on several movies.

We’ll be releasing a more detailed analysis for clients later this afternoon, but at a high level, there are several reasons why this attack is in the news every day, why it will prove to be yet another turning point in the security industry, and why security is so integral to the business technology (BT) agenda:

Friday, 19 December 2014 00:00

How to Turn Open Data into Real Money

I recently interviewed a technology start-up that claimed they were already profitable, with only a few clients and a few months out the door. I have no way to verify or deny that, but I can tell you this: The entire product is built around open data.

In fact, its founders adamantly refused to let me call it a technology company, which is just one of many reasons I’m not revealing its name.

“Our product is the data,” one VP repeatedly told me.

That’s a bit of a bold claim for a company based on government-released data and other open data sets. If it were really the data, and everybody has access to the data, then what’s the point?



Friday, 19 December 2014 00:00

2014: The Perfect Malware Storm

IT security may be an MSP’s core offering or one of several lines of business. But regardless of its business model, a service provider should take stock of the current threat landscape. MSPs need to know what’s out there if they hope to help clients mitigate their security risks.

What are your customers up against? In 2014, they endured the perfect malware storm. Consider the following:



One of the side effects of the consumerization of IT is that some end customers are feeling more empowered than ever to take IT matters into their own hands rather than seek the help of IT solution providers. This is especially true when it comes to cloud services, where business owners (or their employees) can self-install a cloud backup product and instantly have access to 5 GB or more of free cloud storage. Even if business owners aren't actively involved in using or promoting DIY (do-it-yourself) cloud services, research shows their employees are. A study from Skyhigh Networks, which monitors the use of cloud services for businesses, found that the average enterprise uses 545 cloud services, which is approximately 500 more than the average CIO is aware of!

Besides the loss of control of corporate data, DIY cloud services play into the hands of cybercriminals who exploit business owners through ransomware. Like other malware, ransomware infects corporate networks through unpatched computers or when a user clicks on an infected email attachment. Once launched, the ransomware program encrypts common user files on the network--such as documents, spreadsheets and database files--and the victim is required to pay a ransom to decrypt the files.



Friday, 19 December 2014 00:00

Cyber Risk on the Inside

While the Sony cyber attack has put the spotlight on sophisticated external attacks, a new report suggests that insiders with too much access to sensitive data are a growing risk as well.

According to the survey conducted by the Ponemon Institute, some 71 percent of employees report that they have access to data they should not see, and more than half say this access is frequent or very frequent.

In the words of Dr. Larry Ponemon, chairman and founder of The Ponemon Institute:

“This research surfaces an important factor that is often overlooked: employees commonly have too much access to data, beyond what they need to do their jobs, and when that access is not tracked or audited, an attack that gains access to employee accounts can have devastating consequences.”