
Industry Hot News


Everyone likes to get new stuff. Heck, that’s what Christmas is all about, and why it has emerged as a primary driver of the world economy.

In the data center, new stuff comes in the form of hardware and/or software, which lately have formed the underpinnings of entirely new data architectures. But while capital spending decisions almost always focus on improving performance, reducing costs or both, how successful has the IT industry been in achieving these goals over the years?

According to infrastructure consulting firm Bigstep, the answer is: not very. The firm recently released an admittedly controversial study claiming that most organizations would see a 60 percent performance boost by running their data centers on bare-metal infrastructure. Using common benchmarks like Linpack, SysBench and TPC-DS, it contends that multiple layers of hardware and software actually hamper system performance and diminish the investment that enterprises make in raw server, storage and network resources. Even such basic choices as the operating system and dual-core vs. single-core processing can affect performance by as much as 20 percent, and the problem is then compounded by advanced techniques like hyperthreading and shared memory access.



(MCT) — While Anniston, Ala., schools have not been the scene of the sort of firearm violence that has struck other schools around the country in recent years, district officials and others across the state are taking steps to permit a safer outcome if such a situation develops.

The tactic: To let all first responders know the layout of the school before an emergency arises. 

During the summer, detailed 3-D virtual maps were created revealing the nooks and crannies inside each of Anniston City Schools’ seven school buildings, at a cost to the district of between $2,000 and $3,000 per school, said Superintendent Darren Douthitt.



Most of us can’t imagine conducting day-to-day business without email. Our dependence has only increased because of smart devices that keep us connected to our email 24/7.

How would your business operate if suddenly, unexpectedly, no one had access to their email?

More importantly, what would happen if – while that email outage was taking place – all incoming emails were irretrievably lost? Would you miss business opportunities? Could your lack of access make prospects, customers and vendors feel like you are ignoring them, don’t care about their needs (or worse)? Do you fully understand all regulatory implications that may apply to missed communications?



Corporations spend a lot of time and money to ensure their employee- and customer-facing technologies comply with all local and regional data privacy laws. The task is complicated, however, by the patchwork of data privacy legislation around the world, with countries ranging from those with no restrictions on the use of personal data to those with highly restrictive frameworks. To help our clients address these challenges, Forrester developed a research and planning tool called the Data Privacy Heat Map. Originally published in 2010, the tool leverages in-depth analyses of the privacy-related laws and cultures of 54 countries around the world, helping our clients better strategize their own global privacy and data protection approaches.

The most recent update to the tool, published today, highlights two opposing trends affecting data privacy over the past 12 months:



Companies large and small appear to have been targeted in what is being described as the largest known data breach to date.

As first reported by The New York Times, a Russian crime ring amassed billions of stolen Internet credentials, including 1.2 billion user name and password combinations and more than 500 million email addresses.

The NYT said it had a security expert not affiliated with Hold Security, the firm that discovered the cache, analyze the database of stolen credentials and confirm its authenticity.



One of the challenges with Big Data is how to find value hidden in all that volume. Experts generally recommend approaching it as an explorer rather than simply querying the data to find specific answers.

As an astrophysicist, Dr. Kirk Borne knows a thing or two about probing the unknown. Borne, professor of Astrophysics and Computational Science at George Mason University, began tinkering with large data sets because of science, but soon became an advocate for Big Data. Now, in addition to his work as a professor and astrophysicist, Borne is a transdisciplinary data scientist.

In a May post on the MapR blog, Borne identified four major types of Big Data discovery (“data-to-discovery,” as he terms it):



Wednesday, 06 August 2014 17:03

The Increasingly Diverse World of Storage

The changes to storage technology have been well-documented over the years. From tape to disk to solid state, not to mention DAS, SAN/NAS and StaaS, the only constant in the storage industry has been change.

Lately, however, these technological changes have started to coalesce in the data center to produce not only bigger and better storage, but entirely new architectures designed to address increasingly specialized workloads. This has given enterprises an unprecedented ability to craft their own storage environments, rather than simply upgrade their legacy vendor solutions.

Naturally, this is producing a fair amount of turmoil in the traditionally staid storage industry. As Redmond Magazine’s Jeffrey Schwartz notes, established firms like EMC, HP and NetApp are under increasing pressure from start-ups like Nasuni and Pure Storage, which are turning to advanced flash and memory solutions aimed specifically at mobile and cloud-based data loads. Even companies like Microsoft are moving into storage hardware as they ramp up their cloud offerings in the race to beat Amazon to the highly lucrative enterprise storage market.



Amid an already unsettled global state of affairs, cybersecurity and critical infrastructure have become increasingly tense areas. In July, The Economist ran a special section on cybersecurity, and one of the stories focused on critical infrastructure attacks. One passage explains perhaps the key issue driving the underlying threat to the world’s critical infrastructure, and it involves the way in which supervisory control and data acquisition (SCADA) systems, which control network operations, have evolved:

Many of these were designed to work in obscurity on closed networks, so have only lightweight security defences. But utilities and other companies have been hooking them up to the web in order to improve efficiency. This has made them visible to search engines such as SHODAN, which trawls the internet looking for devices that have been connected to it. SHODAN was designed for security researchers, but a malicious hacker could use it to find a target.



Tuesday, 05 August 2014 16:13

Security is not Criminology

Security is, I believe, a major contributor to organisational resilience.  It is about protecting assets from loss and damage, risk analysis and management, and alignment with organisational needs.  It’s not about criminals and criminality.  If you want to be adept and capable as a security professional, knowing about what motivates criminals is not actually of much practical utility.  Why should you be interested in ‘rational choice’ when what you need to know about are the methods required to protect your assets? Why study the nuances of criminal investigation when you are looking into the security breach that has already occurred?  Obviously, if you want to inform methods of limiting future damage then that is useful, but for me not the driving focus of security.

The functions of security have moved on rapidly from alignment to policing activities to a much wider embedded and linked function. The security professional should be as comfortable blending his or her functions with those of crisis and continuity management as in conducting risk analyses. The security professional should be less concerned with crime rates and more with the ability to identify and manage their own vulnerabilities to all types of threat, some malicious and criminal, but many not. The growth in security these days is of course around IT, information and cyber; and there are adversaries out there who are deeply criminal. They no doubt hit all the spots for criminological theories; but it doesn’t matter – the cyber security professional’s role is to limit the penetration and damage whether the adversary is a kid in his bedroom or a nation-state. Or even the insider who does not understand the damage that their IT use can cause.



Traditional data backup happens once every so often: once an hour, once a day, once a week, for example, depending on the recovery requirements associated with the data. It’s typically the recovery point objective, or RPO, that determines the frequency of the backup. If you cannot afford to lose more than the last 30 minutes’ worth of data, then your RPO will be 30 minutes and backups will happen at least every half an hour. Continuous replication, on the other hand, changes the model by backing up your data every time you make a change. But what does that do to RPO, disk space requirements and network capacity (assuming you’re backing up to storage in a different physical location)?
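The arithmetic behind that relationship can be sketched as follows (a minimal illustration with made-up numbers, assuming a simple model in which data written just after a backup starts is not captured until the next run finishes):

```python
def worst_case_loss_minutes(interval_minutes: float,
                            run_minutes: float = 0.0) -> float:
    """Worst-case data loss for scheduled backups: the gap between
    runs plus the time a run takes to complete."""
    return interval_minutes + run_minutes


def max_interval_for_rpo(rpo_minutes: float,
                         run_minutes: float = 0.0) -> float:
    """Longest backup interval that still meets a given RPO."""
    return rpo_minutes - run_minutes


# A 30-minute RPO with 5-minute backup runs means scheduling backups
# no more than 25 minutes apart:
print(max_interval_for_rpo(30, 5))  # 25
```

Continuous replication drives the interval toward zero, so the worst-case loss shrinks toward the replication lag alone; the trade-off, as the question above implies, shifts to storage and network capacity rather than the loss window.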



As the health of two Ebola-stricken American missionaries deteriorated late last month, an international relief organization backing them hunted for a medical miracle. The clock was ticking, and a sobering fact remained: Most people battling the disease do not survive.

Leaders at Samaritan’s Purse, a North Carolina-based Christian humanitarian group, asked officials at the Centers for Disease Control and Prevention whether any treatment existed — tested or untested — that might help save the lives of Kent Brantly and Nancy Writebol, both of whom had contracted Ebola while helping patients in Liberia.

The CDC put the group in touch with National Institutes of Health workers in West Africa, where an employee knew about promising research the U.S. government had funded on a serum that had been tested only in monkeys.



What should a business continuity plan contain? It's important to keep it concise and manageable, but I'm sure we all have our own ideas as to what the 'must have' items are. Charlie Maclean-Bristol of PlanB Consulting takes us through what he thinks the top ten features of a good plan are:

1. Scope. On many of the plans I see it is not clear what the scope of the plan is. The name of the department may be on the front of the plan but it is not always obvious whether this is the whole of the department, which may cover many sites, or just the department based in one location. It should also be clear within strategic and tactical plans what part of the organisation the plan covers. Or does it cover the whole of the organisation? Where large organisations have several entities and subsidiaries it should be clear whether the tactical and strategic plans cover these.



(MCT) — With dozens of local doctors and medical staff among the dead, U.S. and foreign experts are preparing to flood into West Africa to help fight the deadliest Ebola outbreak on record.

Although two Americans, Dr. Kent Brantly and health worker Nancy Writebol, have contracted the disease, health experts say foreigners taking careful precautions should not be at serious risk.

But more than 60 local medical staff, about 8 percent of the fatalities, have died in Sierra Leone, Liberia and Guinea — poor countries with weak, overloaded health-care systems that are ill-equipped to handle the outbreak.

Ebola expert G. Richards Olds, dean of medicine at UC Riverside, compared local health-care workers there to doctors who donned beaked masks, leather boots and long, waxed gowns to fight the plague in medieval Europe.



The harmful toxin found in Lake Erie that caused a water crisis in Ohio's fourth-largest city this weekend has raised concerns nationally. That's because no states — including Texas — require testing for such toxins, which are caused by algal blooms. And there are no federal or state standards for acceptable levels of the toxins, even though they can be lethal.

In Toledo, Ohio, where voluntary tests at a water treatment plant found elevated levels of the toxin microcystin, which is produced by blue-green algae, the city is urging residents and the several hundred thousand people served by its water utility not to drink tap water, even if they boil it. Exposure to high levels of microcystin can cause abdominal pain, vomiting and diarrhea, liver inflammation, pneumonia and other symptoms, some of which are life-threatening. Restaurants have closed and there are shortages of bottled water as far as 100 miles away.

In Texas, which has battled blue-green algae problems at several of its lakes, Terry Clawson, the spokesman for the state's Commission on Environmental Quality, said surface water data has "not demonstrated levels of algal toxins that show any cause for alarm."



(MCT) — Hotshot Hollywood directors make movies about machines that can predict the future and software programs that can peer ahead in time. Silver screen villains plot to use the predictive power for evil; heroes fight for good.

The drama makes for great movies, but it's not all science fiction: the Tennessee Highway Patrol is already using that kind of technology every day.

It's called predictive analytic software. And it could be the start of a whole new generation of traffic safety, a new tool as revolutionary as seat belts or radar.

"It's the coming thing," said Tennessee Highway Patrol Colonel Tracy Trott.

Tennessee Highway Patrol analysts plug all sorts of factors into the software — like weather patterns, special events, home football schedules, festivals and historic crash data — and the program spits out predictions of when and where serious or fatal traffic accidents are most likely to happen.
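The underlying idea can be illustrated with a toy frequency model (a sketch only: the patrol's actual software is proprietary, and the feature names and data below are invented). Historical crashes are tallied by conditions, and the combinations that have produced the most crashes rank as the highest risk:

```python
from collections import Counter


def rank_risk(history):
    """history: (hour, location, weather) tuples from past crashes.
    Returns condition combinations ranked by crash count, riskiest first."""
    return Counter(history).most_common()


# Hypothetical historical crash records:
past_crashes = [
    (17, "I-40 mile 368", "rain"),
    (17, "I-40 mile 368", "rain"),
    (8,  "US-70 mile 12", "clear"),
]

conditions, count = rank_risk(past_crashes)[0]
print(conditions)  # the riskiest (hour, location, weather) combination
```

Real predictive analytics systems fit statistical models over many more factors, but the output has the same shape: a ranking of when and where trouble is most likely.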



ABUJA, Nigeria — In an ominous warning as fatalities mounted in West Africa from the worst known outbreak of the Ebola virus, the head of the World Health Organization said on Friday that the disease was moving faster than efforts to curb it, with potentially catastrophic consequences, including a “high risk” that it will spread.

The assessment was among the most dire since the outbreak was identified in March. The outbreak has been blamed for the deaths of 729 people, according to W.H.O. figures, and has left over 1,300 people with confirmed or suspected infections.

Dr. Margaret Chan, the W.H.O. director general, was speaking as she met with the leaders of the three most affected countries — Guinea, Liberia and Sierra Leone — in Conakry, the Guinean capital, for the introduction of a $100 million plan to deploy hundreds more medical professionals in support of overstretched regional and international health workers.

“This meeting must mark a turning point in the outbreak response,” Dr. Chan said, according to a W.H.O. transcript of her remarks. “If the situation continues to deteriorate, the consequences can be catastrophic in terms of lost lives but also severe socioeconomic disruption and a high risk of spread to other countries.”



Summer vacation: Isn’t it great? Except it is not what it used to be. We are either expected by our employers and clients to somehow remain accessible and productive 24/7 while we’re “off,” or we put that pressure on ourselves. Or we’re in the middle of a job search and don’t want to lose precious momentum or appear not to be serious.

Taking needed vacation time in order to relax and recharge can be especially difficult for those working in IT. A Computerworld piece that is filled with seriously depressing anecdotes about IT folks working through vacation cites a 2014 TEKsystems survey that “found that 47% of senior IT professionals are expected to be available 24x7 while on vacation (up from 44% in 2013), compared to 18% of entry- to mid-level IT professionals (a decrease from 20% in 2013).”

Here are ideas from IT Business Edge and elsewhere for how to manage the expectations, stress, extra duties and communication challenges that your wonderful vacation now brings.



The debate has been going on for a long time. Is it Business Continuity for business processes and Disaster Recovery for IT? Is Business Continuity just the current term for any preparedness planning going on in the organization? Does it depend on who is the driving force behind the need to create a plan? Was it IT, a business line, Audit or Risk Management that got it started? One thing for sure is that in most companies the people on either side of the fence don’t often talk to each other. And it has been that way for years.

When I did an internet search on the topic of Business Continuity vs Disaster Recovery, I found posts going back many years. Just last year (August 27, 2013) Jim Mitchell posted a blog that said, “Unless and until IT and ‘the business’ work together as equal partners in the development of comprehensive Business Continuity, we haven’t moved into a truly ‘post-DR’ world.  As long as the two extremes see themselves as adversaries, they are unlikely to reach true Business Continuity objectives.  As long as they fight separately over the same budget dollars (and we all know who usually wins that battle), they will never truly be partners in organization recoverability.” A year later this is still true.



The Director-General of WHO and presidents of west African nations impacted by the Ebola virus disease outbreak will meet Friday in Guinea to launch a new joint US$100m response plan as part of an intensified international, regional and national campaign to bring the outbreak under control.

The scale of the ongoing outbreak is unprecedented, with approximately 1,323 confirmed and suspected cases reported, and 729 deaths in Guinea, Liberia and Sierra Leone since March 2014.

“The scale of the Ebola outbreak, and the persistent threat it poses, requires WHO and Guinea, Liberia and Sierra Leone to take the response to a new level, and this will require increased resources, in-country medical expertise, regional preparedness and coordination,” says Dr Chan. “The countries have identified what they need, and WHO is reaching out to the international community to drive the response plan forward.”



When it comes to business continuity and disaster recovery planning, hope is not a strategy. IT departments, however, are too often surprised by the inevitable when a disaster they could have seen coming changes everything. Even companies that have a good disaster recovery or even disaster recovery as a service (DRaaS) plan in place aren't immune to significant business disruptions; they may think their company is fully protected, but Logicalis US warns that having a disaster recovery plan alone may be putting the proverbial cart before the horse.

The horse, in this case, is developing a solid business continuity strategy first.

"Disaster recovery – even DR as a Service – is technology based. The technology will save whatever data you tell it to, but the success of your business depends as much – if not more – on the effectiveness and efficiencies of your processes and procedures," says David Kinlaw, Practice Manager, Data Protection and Availability Services, Logicalis US. "Critically reviewing, evaluating and improving those processes and procedures is therefore essential to ensuring the success of your business."

That's because the true value of business continuity planning is not limited to technology. Done correctly, the exercise of developing and implementing a thorough business continuity plan opens ongoing conversations between IT and business units, empowering them as a team to face whatever challenges lie ahead. Combine a well-implemented disaster recovery or DRaaS plan with a strong business continuity strategy and the organization will have a winning combination for long-term sustainability.



For a while, the general assumption was that Ethernet would supplant all things Fibre Channel in the data center. But the rise of cloud computing and virtualization has created demand for more storage bandwidth than ever.

Rising to the challenge, Cisco this week made additions to its storage area network (SAN) lineup that not only provide 16G of bandwidth, but are also much simpler to manage by both automating the provisioning process and providing tools for detecting network congestion and recovery logic that helps ensure application performance requirements are continuously met.

Nitin Garg, senior manager for product management in the data center switching group at Cisco, says it is now much simpler to provision the Cisco MDS 9148S 16G Fabric Switch, the Cisco MDS 9706 Storage Director, and the Cisco MDS 9700 FCoE Module for multi-protocol networking fabrics.



In its fifth annual board of directors survey, “Concerns About Risks Confronting Boards,” EisnerAmper surveyed directors serving on the boards of more than 250 publicly traded, private, not-for-profit, and private equity-owned companies to find out what is being discussed in American boardrooms and, in turn, what those boards are accomplishing as a result.

According to the report, reputation remains the top concern across a range of industries:



(MCT) — When a major hurricane strikes the Gulf Coast again — as it inevitably will — the federal government will undoubtedly respond in some manner, just as it did after hurricanes Rita and Ike. But the key word in that sentence is "after." The damage will have been done, and coastal residents will bear the brunt of the recovery.

A new study by the National Research Council reinforces that reality. It encourages state and local governments to do all they can now to minimize devastation from hurricanes instead of hoping that Washington will ride to the rescue afterward.

That makes sense. Congress is usually slow to act when disasters strike, and the Federal Emergency Management Agency has a spotty record — even if it has improved in recent years. Responsibility for hurricane risk is scattered among many governmental agencies, the study says, yet collectively they are doing little about protecting coasts before storms strike.



We all rely on USB to interconnect our digital lives, but new research first reported by Wired reveals that there's a fundamental security flaw in the very way the humble Universal Serial Bus functions, and it could be exploited to wreak havoc on any computer.

Wired reports that security researchers Karsten Nohl and Jakob Lell have reverse engineered the firmware that controls the basic communication functions of USB. Not only that, they've also written a piece of malware, called BadUSB, that can "be installed on a USB device to completely take over a PC, invisibly alter files installed from the memory stick, or even redirect the user's internet traffic."

Embedded within USB devices—from thumb drives through keyboards to smartphones—is a controller chip that allows the device and the computer it's connected to to send information back and forth. It's this that Nohl and Lell have targeted, which means their malware doesn't sit in flash memory, but rather is hidden away in firmware, undeletable by all but the most technically knowledgeable. Lell explained to Wired:



As the deadliest outbreak of Ebola in recorded history continues to devastate Western Africa, the American Red Cross is supporting efforts through both financial and staffing support.

While the Sierra Leone Red Cross is taking the lead in promoting awareness through social mobilization campaigns, the American Red Cross, along with the global Red Cross network, is helping amplify efforts and strengthen capacity. An American Red Cross specialist has been deployed to provide telecommunications support and internet access to the health team in country, following another IT specialist who had been in Sierra Leone for the past month.

The American Red Cross has also assisted with remote mapping and information management in the region and has contributed $100,000 to strengthen the capacities of both the Liberia Red Cross and Guinea Red Cross. These funds will help manage the Ebola outbreak response and increase public awareness of the virus.

Red Cross volunteers in the region are working to assist with Ebola awareness efforts. In total, more than 1,200 volunteers have been mobilized in Sierra Leone, Liberia and Guinea to date.

Since March 2014, some 1,200 cases have been reported and more than 670 deaths have been linked to the virus in Sierra Leone, Liberia, Guinea and most recently, Nigeria.

Currently outbreaks are centered in the cities of Kailahun and Kenema in Sierra Leone, and the counties of Lofa and Montserrado in Liberia.

Recognizing the severity of the issue, Liberian President Ellen Johnson Sirleaf has announced the closure of most of Liberia’s borders, with stringent medical checks being stepped up at airports and major trade routes. The government has also banned public gatherings of any kind, including events and demonstrations.

Difficulties remain in identifying cases, tracing contacts, and raising public awareness about the disease and how to reduce the risk of transmission. These difficulties, including widespread misconception, resistance, denial and occasional hostility, are considerably complicating the humanitarian response to containing the outbreak.

For more information on the Ebola outbreak and response, visit http://www.ifrc.org.

One of the more frustrating consequences of the consumerization of IT is that, no matter how hard internal IT departments try, they can’t wean end users off shadow IT services. Much of that has to do with the user experience those services provide: designed for consumers, they tend to be a lot simpler to use than applications delivered by the enterprise IT organization. The simple fact is that in order to win that battle, internal IT organizations have to deliver an application that provides a much better customer experience than the consumer application they are trying to replace.

With that goal in mind, EMC Syncplicity has delivered an upgrade to its file transfer and synchronization software for Apple iOS devices that not only makes it easier to surface the most relevant and pertinent content, but also predicts which content an end user is likely to want to access next.



By Deborah Ritchie

A report from the Information Commissioner’s Office sets out how the law applies when big data uses personal information. It details which aspects of the law organisations need to particularly consider. Big data is a way of analysing data that typically uses massive datasets, brings together data from different sources and can analyse the data in real time. It often uses personal data, be that looking at broad trends in aggregated sets of data or creating detailed profiles in relation to individuals, for example lending or insurance decisions.

Some commentators have argued that existing data protection law can’t keep up with the rise of big data and its new and innovative approaches to personal data. That is not the view of the ICO, which stresses the basic data protection principles already established in UK and EU law are flexible enough to cover big data. “Applying those principles involves asking all the questions that anyone undertaking big data ought to be asking,” the report reads. “Big data is not a game that is played by different rules.”



(MCT) — During severe weather, Carla Kerr, her daughter and her mother hunker down in their 10-foot-long bathroom on the first floor. With blankets, a flashlight and a weather radio, it’s a bit of a tight fit.

As residents of Guinotte Manor, a public housing complex in Kansas City, they don’t have basements where they can take cover from tornadoes.

At the end of next summer, Kerr will have a safer solution across the street.

Construction will start this summer on a safe room at the Garrison Community Center, said Bob Lawler, a project manager with the Kansas City Parks & Recreation Department. The safe room will be able to withstand the highest-rated tornadoes while holding 1,300 occupants, close to the estimated number of residents within a half-mile radius.



A survey into cyber security in the retail sector suggests that a number of organisations don’t realise that the goal of PCI compliance is the protection of cardholder data alone – not of the business as a whole.

Conducted by Dimensional Research and Atomik Research and sponsored by Tripwire, the survey evaluated the attitudes of 407 retail and financial services organisations in the US and the UK on a variety of cyber security topics.

Despite industry data to the contrary, Tripwire’s retail cybersecurity survey indicates that organisations that rely on PCI compliance as the core of their information security program were twice as confident that they could detect rogue applications, such as those used to exfiltrate data. These respondents were also significantly more confident that they would be able to detect misconfigured or unauthorised network shares, which was a key attack vector exploited in the Target data breach.



Ensuring employee safety by rapidly disseminating the right information, and keeping communication lines open in a time of crisis, are both priorities for businesses. Traditional solutions have relied on the manual ‘call tree’ or ‘phone tree’: key employees are contacted first to inform them of whatever situation or crisis has arisen, with remaining staff to be contacted as soon as possible afterwards. However, even for smaller organisations of, say, 100 people, the manual call tree rapidly demonstrates its limitations. For larger enterprises, there is no doubt – a better solution is required.
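The limitation is easy to quantify with a toy model (a sketch with assumed numbers: each person phones two colleagues, at three minutes per call). Notification time grows with the number of sequential calling rounds, and a single unreachable person silently cuts off an entire branch of the tree:

```python
def call_tree_rounds(staff: int, calls_per_person: int = 2) -> int:
    """Sequential rounds needed for a manual call tree in which every
    notified person then phones `calls_per_person` colleagues."""
    notified, rounds = 1, 0
    while notified < staff:
        notified += notified * calls_per_person
        rounds += 1
    return rounds


# For a 100-person organisation, even with no missed calls, the last
# employees hear the news only after several sequential rounds:
rounds = call_tree_rounds(100)
print(rounds * 2 * 3, "minutes")  # 2 calls per round x 3 min per call
```

Automated notification systems collapse those rounds into a single simultaneous broadcast across phone, SMS and email, which is the gap the "better solution" above is meant to fill.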



MONTGOMERY, Ala. – Community Emergency Response Teams prepare for the worst, then when disaster strikes, they help themselves, their families, their neighborhoods and their communities.

Begun in Los Angeles in 1985, the CERT program consists of specially trained volunteers who are called into action during and immediately following major disasters before first responders can reach the affected areas. They work closely with fire and emergency management departments in their communities.

More than 2,200 CERT programs are available in the United States. In Alabama, 10 counties offer CERT training and maintain teams. During a disaster, Alabama CERT members may self-deploy in their neighborhoods, be mobilized by a sheriff’s office or report to a pre-determined location.  

“CERT groups provide immediate assistance to people in their areas and lead spontaneous volunteers before we can get to the area and inform emergency management of what the needs are,” said Art Faulkner, director of Alabama Emergency Management.

Billy Green, Deputy Director of Emergency Management for Tuscaloosa County, had just finished a training class for Hispanic CERT volunteers the week before the tornado outbreak of April 2011 in Alabama.

“We finished on the Saturday before the tornadoes hit,” he said. “These Spanish speakers took exactly what they learned and put it out in the field. The City of Holt has a high Hispanic population, and this team was able to go out there and do search and rescues.”

Holy Spirit Catholic Church set up its own shelter for the Hispanic population, he added. “Those guys were in that shelter helping and making sure everyone was all right.”

This April’s severe weather and flooding caught many Mobile County residents by surprise, said Mike Evans, Deputy Director of Mobile County Emergency Management Agency.

“Mobile gets the most rainfall of anywhere in the continental United States with 67 inches per year,” he said. “This wasn’t like during hurricane season; getting a lot of rain and thunderstorms is pretty common. But areas that normally flood didn’t, it was urban areas.”

Since the ground was already saturated, the rain had nowhere to go so roads that were low flooded, he said.

“People tried to drive through and we had to get them out,” Evans said.

CERTs distributed commodities and one team knocked on doors asking who was going to leave the area and who was going to stay, he said. After the storm, his teams notified people who left the area of the status of their property.

CERTs can also work with crowd and traffic control, work at water stations at large events, help community members prepare for emergencies, and assist with fire suppression and medical operations as well as search and rescue operations.

Initially, CERT members take training classes that cover components of disaster activities, including disaster preparedness, fire suppression, medical operations, search and rescue and disaster psychology and team organization. Additional training occurs twice a year with mock disasters. Refresher courses are also held. The Federal Emergency Management Agency supports CERT by conducting or sponsoring train-the-trainer and program manager courses for members of the fire, medical and emergency management community, who then train individual CERTs.

CERTs are organized in the Alabama counties of Dale, DeKalb, Shelby, Morgan, Tallapoosa, Jefferson, Colbert, Calhoun, Russell and Coffee.

To join an existing CERT program in your community, go online to www.fema.gov/community-emergency-response-teams. Click on the “find nearby CERT programs” link and enter your zip code. If there is a team near you, you will see the name and phone number of a contact person as well as pertinent information about the local program.

That site can also provide information on how to build and train your own community CERT, the curriculum for training members as well as how to register the program with FEMA.

Aside from providing a vital community service, CERT members receive professionally recognized training and continue to increase their skills.

“CERTs complement and enhance first-response capabilities by ensuring safety of themselves and their families, working outward to the neighborhood and beyond until first responders arrive,” said FEMA’s Federal Coordinating Officer Albie Lewis. “They are one of the many volunteer organizations that we rely on during a disaster.”

The industry is so focused right now on Big Data and the Internet of Things that it’s hard to write about anything else. But it’s important to remember that some organizations are still struggling with more basic data problems.

Government Technology recently published a contributed piece about Lodi, California, a town of about 60,000 people and a $350 million wine industry.

Jay Mishra, VP of development at Astera Software, wrote the piece, and it’s pretty obvious he’s promoting the company’s own ETL solution.
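For readers unfamiliar with the term, ETL (extract, transform, load) is the pattern products like Astera's automate at scale. The toy Python sketch below, with entirely made-up parcel data, shows the three steps in miniature: pull rows out of a source format, clean and type them, and load them into a queryable store.

```python
import csv
import io
import sqlite3

# Extract: read rows from a raw CSV source (hypothetical parcel records).
raw = "parcel,acres\n101, 12.5\n102, 7.25\n"
reader = csv.DictReader(io.StringIO(raw))

# Transform: strip stray whitespace and cast acreage to a number.
rows = [(r["parcel"].strip(), float(r["acres"])) for r in reader]

# Load: insert the cleaned rows into a queryable database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE parcels (parcel TEXT, acres REAL)")
db.executemany("INSERT INTO parcels VALUES (?, ?)", rows)

total = db.execute("SELECT SUM(acres) FROM parcels").fetchone()[0]
print(total)  # 19.75
```

Real ETL tools add scheduling, error handling and connectors for dozens of sources, but the underlying flow is the same.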



More than 175 million records were compromised between April and June due to 237 data breaches, bringing the 2014 total to 375 million records affected and 559 data breaches. That’s a lot of records illegally accessed across fewer than 1,000 breaches worldwide. What this tells me is that even SMBs store a lot more records than they may realize, and a single data breach can result in a huge payoff for a hacker.

These numbers are from SafeNet’s Breach Level Index second quarter report. The report found that retail was the hardest hit industry, with more than 145 million records stolen, or 83 percent of all data records breached, according to a release.

Here is an important finding in the report: Less than 1 percent of all of the data breaches in the second quarter happened to networks that used encryption or strong security platforms to protect the data. So, no, not every security system is foolproof, but you greatly improve your chances of avoiding a breach if you put strong security practices in place. At the same time, it is a little scary to think how many businesses are still lacking when it comes to network security. Good security is vital to any company’s success, and a second report from SafeNet shows why. Once a customer discovers a company has been breached, he or she is not likely to return. As Yahoo Finance reported:



This story was originally published by Data-Smart City Solutions.

Data science and big data are hot topics in today’s business and academic environments. Corporations in a variety of industries are building teams of data scientists. Universities can barely keep up with student demand for courses. The hope is that new analytic methods, combined with more data and computational power, will uncover insights that would otherwise remain undiscovered.  In the private sector, these new insights lead to new revenue opportunities and more targeted investments.



This week I’m back at the National Emergency Training Center (NETC) in Emmitsburg, MD. If you’ve read some of my past blogs, you’ll know that this is “home base” for National Community Emergency Response Team (CERT) training. Even though this isn’t my “first rodeo” at the NETC, I still find it an honor whenever I get the opportunity to teach here. There’s so much history in this region of the United States as well as on the campus that houses the NETC. Throughout the week, I hope to share a few of the stories and sites that make this such a special place to come to.

The campus

The NETC is home to both the National Fire Academy (NFA) and the Emergency Management Institute (EMI). The 107-acre campus was the original site of Saint Joseph’s Academy, a Catholic school for girls from 1809 until 1973. It was purchased by the U.S. Government in 1979 for use as the NETC.

The National Fire Academy (NFA) is one of two schools in the United States operated by the Federal Emergency Management Agency (FEMA) at the NETC. Operated and governed by the United States Fire Administration (USFA) as part of the U.S. Department of Homeland Security (DHS), the NFA is the country’s pre-eminent federal fire training and education institution. The original purpose of the NFA as detailed in a 1973 report to Congress was to “function as the core of the Nation’s efforts in fire service education—feeding out model programs, curricula, and information.”



Apache’s open source Storm may be the big buzz in Big Data streaming analytics, but according to a recent Forrester report, the commercial vendors are the ones who have “got the goods.”

While Storm is used by a number of high-profile companies, including the Weather Channel, Spotify and Twitter, the research firm writes that it’s nonetheless “a very technical platform that lacks the higher order tools and streaming operators that are provided by the vendor platforms evaluated in this Forrester Wave …”

In its July report on Big Data Streaming Analytics Platforms, the research firm reviewed seven platforms: IBM, Informatica, SAP, Software AG, SQLstream, TIBCO and Vitria. Forrester assessed each on 50 criteria, including business application and platform integration, data sources, development tools, ability to execute, partnerships and pricing.
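For context, the basic building block these commercial platforms supply, and that raw Storm largely leaves to the developer, is the streaming operator: windowing, aggregation and the like. A minimal illustration in Python (purely a sketch, not any vendor's API) counts events inside fixed, non-overlapping 60-second tumbling windows:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds: int):
    """Group (timestamp, key) events into fixed, non-overlapping time windows
    and count occurrences of each key per window -- a basic streaming operator."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        windows[ts // window_seconds][key] += 1
    return {w: dict(counts) for w, counts in windows.items()}

# Hypothetical event stream: (seconds-since-start, event type).
events = [(1, "login"), (3, "login"), (62, "error"), (65, "login")]
print(tumbling_window_counts(events, window_seconds=60))
# {0: {'login': 2}, 1: {'error': 1, 'login': 1}}
```

Production engines layer on sliding windows, out-of-order event handling and fault tolerance, which is where the vendor platforms differentiate themselves.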



Thursday, 31 July 2014 16:45

The Changing Data Center Climate

Ask a roomful of IT experts what the future holds for the data center and you’re likely to get a roomful of different opinions. This is doubly true during periods of revolutionary change like we are seeing now.

With outlooks ranging from all-cloud, all-software constructs to massive hyperscale infrastructure tailored toward specific web-facing or Big Data workloads, it seems that the enterprise has a range of options when it comes to building next-generation infrastructure.

Even during times of heady change, however, it is still useful to anticipate the future by analyzing the past. TechNavio, for example, has noticed that rack units have nearly doubled in size over the past decade, leading the firm to conclude that future data centers will feature higher ceilings and taller equipment racks. A key driver in this is the rising cost of property, which is causing designers to build up rather than out. But it also has to do with the need for increased densities and the prevalence of wireless connectivity, which reduces the need for bulky cables.



KPMG’s UK and global lead in KPMG’s cyber security practice, Malcolm Marshall, is warning organizations about the impact that international political disputes can have on the ability to conduct ‘business as usual’. He suggests that, “whilst attention is focused on the search for resolutions in the ‘corridors of power’, businesses need to be ready to defend themselves, as the cyberspace in which they operate increasingly becomes the new battleground.”

Mr. Marshall says: “Businesses are so focused on cyber-attacks by organized crime that it is easy for them to ignore the possibility of being targeted by groups wanting to make a political point, possibly even with backing from a hostile government.



The International Federation of Risk and Insurance Management Associations (IFRIMA) has established a working group to define ‘the core knowledge and competencies that lie at the heart of risk management in whatever context it is practiced’.

FERMA, RIMS, Alarys and the Institute of Risk Management (IRM) are among the organizations taking part.

The aim of the working group is to produce a short document that any risk management body can use as the foundation of a risk management education and /or certification process.

Publication is planned for sometime in 2015.

More details.

Employing a third party to store and deliver assets critical to Disaster Recovery or Business Continuity Plans can be invaluable.  But offsite storage should never be “dump it and forget it”.  Despite everything your storage provider may promise, it’s what you don’t know that could become a problem when you need to retrieve your data backup, ‘go box’ or other essential recovery assets.

First, there’s the hand-off process.  If your IT team ships physical backups offsite on a regular basis, that process can become routine.  Over time, routine can slip into neglect. Neglect can result in outcomes that may become a problem – or a disaster – when it’s time to recall those backups.  And if you are using internal means to store vital assets, understanding the process and its security is just as critical – perhaps even more so.

What is the process?  Is it documented? Is it verified with the vendor/provider periodically?  Take the time to visit the provider (or even follow their pickup agent) to see exactly how the process works.  Ask to see your stored materials, the vendor’s logs and their entry procedures in action.
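One concrete, low-cost verification habit worth adding to that checklist is fingerprinting assets at hand-off so they can be checked at recall. The Python sketch below uses only the standard library; the function names are our own, not any vendor's.

```python
import hashlib
from pathlib import Path

def fingerprint(path: Path, chunk: int = 1 << 20) -> str:
    """SHA-256 of a backup file, streamed in 1 MB chunks so large
    archives never need to fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):
            digest.update(block)
    return digest.hexdigest()

def verify_on_recall(path: Path, recorded_hash: str) -> bool:
    """Compare a recalled asset against the hash logged at hand-off;
    any corruption or substitution changes the digest."""
    return fingerprint(path) == recorded_hash
```

Log the hash alongside the shipment record; at recall, a mismatch tells you immediately that what came back is not what went out.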



Civic technologist Matt Stempeck makes an unusual proposal in a recent Harvard Business Review post (registration required): Businesses, especially in the tech sector, should consider donating data over dollars.

Stempeck draws the idea from the International Charter on Space and Major Disasters, a 1999 act that required satellite companies to provide imagery to public agencies in times of crisis. Stempeck points out that under that act, DMC International Imaging has provided valuable imagery on:

  • Flooding in the UK and Zimbabwe
  • The spread of locusts in Algeria
  • Fires in India
  • Snow in South Korea



Study looks at more than 60 years of coastal water level and local elevation data changes

Annapolis, Maryland, pictured here in 2012, saw the greatest increase in nuisance flooding in a recent NOAA study. (Credit: With permission from Amy McGovern.)


Eight of the top 10 U.S. cities that have seen an increase in so-called “nuisance flooding”--which causes such public inconveniences as frequent road closures, overwhelmed storm drains and compromised infrastructure--are on the East Coast, according to a new NOAA technical report.

This nuisance flooding, caused by rising sea levels, has increased on all three U.S. coasts by between 300 and 925 percent since the 1960s.

The report, Sea Level Rise and Nuisance Flood Frequency Changes around the United States, also finds Annapolis and Baltimore, Maryland, lead the list with an increase in number of flood days of more than 920 percent since 1960. Port Isabel, Texas, along the Gulf coast, showed an increase of 547 percent, and nuisance flood days in San Francisco, California increased 364 percent.

"Achieving resilience requires understanding environmental threats and vulnerabilities to combat issues like sea level rise," says Holly Bamford, Ph.D., NOAA assistant administrator of the National Ocean Service. "The nuisance flood study provides the kind of actionable environmental intelligence that can guide coastal resilience efforts."

“As relative sea level increases, it no longer takes a strong storm or a hurricane to cause flooding,” said William Sweet, Ph.D., oceanographer at NOAA’s Center for Operational Oceanographic Products and Services (CO-OPS) and the report’s lead author. “Flooding now occurs with high tides in many locations due to climate-related sea level rise, land subsidence and the loss of natural barriers. The effects of rising sea levels along most of the continental U.S. coastline are only going to become more noticeable and much more severe in the coming decades, probably more so than any other climate-change related factor.”  

The study was conducted by scientists at CO-OPS, who looked at data from 45 NOAA water level gauges with long data records around the country and compared that to reports of number of days of nuisance floods.

Nuisance flooding events have increased around the U.S., but especially off the East Coast. Click graphic for high resolution PDF. (Credit: NOAA)


The extent of nuisance flooding depends on multiple factors, including topography and land cover. The study defines nuisance flooding as a daily rise in water level above the minor flooding threshold set locally by NOAA’s National Weather Service, and focused on coastal areas at or below these levels that are especially susceptible to flooding.
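The counting method itself is straightforward to illustrate. The Python sketch below, using invented gauge readings rather than the report's actual data, counts days above a local threshold and computes the kind of era-over-era percent increase the report cites:

```python
def nuisance_flood_days(daily_max_levels, threshold):
    """Count the days whose highest water level tops the locally set
    minor-flooding threshold -- the report's definition of a nuisance flood."""
    return sum(1 for level in daily_max_levels if level > threshold)

def percent_increase(early_avg_days, late_avg_days):
    """Change in average annual nuisance-flood days between two eras."""
    return round((late_avg_days - early_avg_days) / early_avg_days * 100)

# Hypothetical week of daily maximum water levels (meters above the
# mean higher high water mark) at a gauge with a 0.25 m threshold:
week = [0.10, 0.31, 0.28, 0.05, 0.40, 0.12, 0.22]
print(nuisance_flood_days(week, threshold=0.25))  # 3
print(percent_increase(4, 40))                    # 900
```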

The report concludes that any acceleration in sea level rise that is predicted to occur this century will further intensify nuisance flooding impacts over time, and will further reduce the time between flood events.

The report provides critical NOAA environmental data that can help coastal communities assess flooding risk, develop ways to mitigate and adapt to the effects of sea level rise, and improve coastal resiliency in the face of climate- and weather-induced changes.

NOAA's mission is to understand and predict changes in the Earth's environment, from the depths of the ocean to the surface of the sun, and to conserve and manage our coastal and marine resources. Join us on Twitter, Facebook and our other social media channels.

Top ten U.S. areas with an increase in nuisance flooding* (for each area, the report lists the nuisance level in meters above the mean higher high water mark, the average nuisance flood days for 1957-1963 and for 2007-2013, and the percent increase):

1. Annapolis, Md.
2. Baltimore, Md.
3. Atlantic City, N.J.
4. Philadelphia, Pa.
5. Sandy Hook, N.J.
6. Port Isabel, Texas
7. Charleston, S.C.
8. Washington, D.C.
9. San Francisco, Calif.
10. Norfolk, Va.

* More than one flood on average between 1957-1963, and for nuisance levels above 0.25 meters.

As the enterprise delves ever deeper into virtual and cloud infrastructure, one salient fact is becoming clearer: Attributes like scalability and flexibility are not part and parcel to the technology. They must be developed and integrated into the environment so that they can truly provide the benefits that users expect.

Even at this early stage of the cloud transition, providers are already feeling the blowback that comes from overpromising and under-delivering. According to a recent study by Enterprise Management Associates (EMA), one third of IT executives say they found scaling either up or down to be not as easy as they were led to believe. With data loads ebbing and flowing in a continual and often chaotic fashion, just trying to match loads with available resources is a challenge even with modern automation and orchestration software.



Tuesday, 29 July 2014 15:33

New Online Tool Tracks Shoreline Shifts

(MCT) — The coast's susceptibility to big storms is clearly no secret, but ever wonder what the shoreline looked like 100 years ago? Or about the rate at which sea level is changing? The U.S. Geological Survey released an interactive website last week that will allow Coastians to easily research coastal changes.

The tool, called the USGS Coastal Change Hazards Portal, shows changing sea levels, retreating shorelines and vulnerability to extreme coastal storms. A link to the site can be found at sunherald.com.

USGS research geologist Robert Thieler said a large driver behind the portal, which became available July 16, was to bring the three research themes together into one easy-to-use website. He said the functionality of the site and the value of the information make it a useful tool for the general public as well as city and county officials.



All organizations with a Business Continuity Management (BCM) or Disaster Recovery (DR) program strive to keep their Business Continuity Plans (BCP) and Disaster Recovery Plans (DRP) in a usable state – one they believe will cover them in any and all situations. At a minimum, they want their plans to make them responsive to any situation. But if an organization takes its program – and related plans – seriously, then these plans are never fully complete.

For a plan to be truly viable and robust, it must address as many situations as possible while remaining flexible enough to adapt to any potential unknown situation. If it’s ‘carved in stone’, it becomes tough to adapt the plan to the situation (the situation won’t adapt to your plan).

This flexibility – and its maintenance, which keeps the plan alive – includes capturing lessons learned from news headlines and incorporating the new activities or considerations that may not be in the current BCM / DRP plan. These plans aren’t quick fixes or static responses to disasters; they are ‘living and breathing’ documents that need new information to grow and become robust. This is why they should never be considered complete: as the organization grows and changes – and as the circumstances surrounding the organization change – so too must the BCM and DRP plans.



Many of the attacks launched against today’s brands are as covert as they are debilitating. In today’s connected age, savvy cyber criminals often blitz companies with a flurry of activity across an array of online channels.

To make matters worse, employees who are using the Internet casually or personally create a vulnerability for businesses: workers could click on a phishing link sent to their personal account and unknowingly be exploited by cyber criminals, or they could bring harm to the business via a social media post they thought to be harmless.

And, let’s not forget that brands can also inflict damage on themselves, such as through executive scandals, accounting errors or failing to protect customers and investors. Even though these events may not involve a malevolent, third-party attacker, the resulting fallout can be just as severe as if they fell prey to one.



Another American aid worker has become infected with the Ebola virus in Liberia. The patient has been identified as Nancy Writebol, a worker with the Christian aid group Serving in Mission (SIM), which runs a hospital on the outskirts of the Liberian capital, Monrovia. Writebol had been working as a hygienist responsible for decontaminating those coming and going from the hospital’s Ebola care center.

The diagnosis follows that of Dr. Kent Brantly, a doctor who was working in the same care center for the allied aid group Samaritan’s Purse. Both Americans are said to be in stable but serious condition. Ken Isaacs, a vice president at Samaritan’s Purse, told the AP of the team’s declining morale: “It’s been a shock to everyone on our team to have two of our players get pounded with the disease… Our team is frankly getting tired.”

The two cases fall in the midst of an historically devastating outbreak which has killed 129 people in Liberia, and more than 670 throughout West Africa. The highly contagious disease has no cure or vaccine, meaning aid workers must prevent the spread of the disease altogether. Thus, aid workers and the WHO have to rely heavily on public education and social mobilization to discourage activities that pose a high risk of transmission. Ebola is spread through direct contact with bodily fluids and organs, so these activities include close contact with infected people and unsafe burial of the dead.



(MCT) — Saying California’s emergency responders need more training to handle major calamities, state and local leaders are pitching plans to build a world-class $56 million training facility in eastern Sacramento County that would pit fire crews against a variety of realistic, pressure-packed simulated disasters.

Emergency crews would be required to douse a real 727 jet as it lies in pieces across a field after a simulated crash at the training site; or make split-second decisions on how to approach a derailed train leaking crude oil; or figure out how to quickly pull survivors out of a partially demolished and unstable building after a terrorist bombing or earthquake.

Initial construction on the Emergency Response Training Center has begun on 53 acres east of Mather Field in Rancho Cordova. The facility, billed as one of the most varied, modern and sophisticated training sites in the country, would be “a total disaster city,” said Sacramento Metropolitan Fire District Chief Kurt Henke, one of the officials behind the push.



Organizations invest hundreds of thousands of dollars in redundant hardware and software for their data centers to ensure high availability (the five 9s, again) and resiliency. These same companies hire IT professionals with very specialized experience and certifications to ensure their capital investments maintain high availability and data security on a day-to-day basis.
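As a reminder of what "the five 9s" actually buys, 99.999 percent availability allows only minutes of unplanned downtime per year. The back-of-the-envelope arithmetic is easy to check:

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def allowed_downtime_minutes(nines: int) -> float:
    """Yearly downtime budget implied by an availability of N nines
    (e.g. 5 nines = 99.999 percent uptime)."""
    unavailability = 10 ** -nines
    return round(unavailability * MINUTES_PER_YEAR, 2)

print(allowed_downtime_minutes(3))  # 525.6  (99.9%: nearly 9 hours a year)
print(allowed_downtime_minutes(5))  # 5.26   (99.999%: about 5 minutes a year)
```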

High availability within the organization shouldn’t be identified only with a server failure or outage. Serious impacts to your business can be caused by natural disasters (like hurricanes or floods), strikes, major highway closures or terrorist events, all without a single server ever going down! Loss of sales, regulatory fines, contract penalties, employees unable to get to your business, loss of suppliers, and damage to branding and reputation can all easily follow, and for very long periods of time, without directly touching the five 9s.

Yet these very same companies often overlook the fact that these same IT professionals sometimes aren’t trained or experienced in recovering from a disaster and ensuring business continuity. Business continuity planning requires a high-level view of IT, but more importantly, a rock-solid understanding of business processes and the potential consequences of natural disasters, strikes, highway closures, terrorist events (and so on) on the business. FEMA has some very valuable information on Business Impact Analysis (BIA) as well as operational and financial impacts. Ironically, IT doesn’t even make the list!



With the referendum on independence for Scotland not far away, the Business Continuity Institute (BCI) has published a paper to help organisations on both sides of the border consider what the impact of independence could be for them and provide links and resources that will help organisations further understand the debate.

Whether it is a new government being elected or an international treaty signed, political change happens all the time and this will invariably have an impact on organisations. Priorities, budgets and regulations all change, resulting in organisations having to rethink their strategies. If disruption can occur when something relatively routine happens, what could happen when there is an entire change of sovereignty?

The BCI remains neutral in this debate but highlights that, whatever the outcome, there will be change in Scotland that will require organisations to reconsider their business continuity plans and strategies. Some of the key findings of the paper include:

  • The decision could significantly influence the conduct of business across the Anglo-Scottish border and organisations may have to adapt to a possible independence settlement.
  • Policy divergence in key areas such as taxation, business regulation and labour laws is likely to significantly affect highly regulated professions such as the finance and banking industries. For other sectors, policy divergence may result in an increase in operational and logistics costs.
  • A monetary union with rUK that allows an independent Scotland to use the pound sterling will keep transaction and borrowing costs to a minimum. However, this may come at the expense of setting independent Scottish monetary policy that may have effects on some businesses. Meanwhile, adopting a different currency for an independent Scotland carries costs so organisations need to have financial safeguards in either case.
  • Both sides of the border enjoy interconnected infrastructure and this is not likely to change following independence. However, two sets of policies for critical infrastructure are likely to emerge due to inherent strategic differences of both countries and this may influence the operational terrain for businesses operating on both sides of the border.
  • Differences in energy production/consumption may influence sourcing arrangements and cause variations in fuel and energy costs for sectors such as manufacturing, logistics and retail.
  • An independent Scotland’s admittance to the EU is uncertain and the indefinite timescale of the accession process may introduce short-term uncertainty over the continued access to benefits provided by EU membership.

A change in the Anglo-Scottish relationship will carry far-reaching consequences for organisations and the way they operate across the border. Independence in itself carries potential opportunities: both Scotland and the rUK remain excellent markets, and the highly interlinked trade between the two countries is not likely to change following independence. However, the change in their relationship will inevitably influence business operations, albeit to differing degrees. This is something that organisations must understand as they wait for the results of the independence referendum, and must plan for either way.

Lyndon Bird, Technical Director at the BCI, commented: “This is a big decision for the people of Scotland that could have far reaching consequences for organisations on both sides of the border and further beyond. Like with any major event however, looking ahead, establishing what the impact could be and making sure plans are in place to deal with these should allow any organisation to operate as normal during abnormal circumstances.”

Whilst the outcome of the vote remains uncertain, maintaining continuity amidst political change is a constant. The Scottish independence referendum is a unique event that may influence the conduct of business on both sides of the border. It is essential that organisations know what is at stake as their capacity to adapt will determine their viability regardless of independence or continued union.

To read the full report, click here.

Based in Caversham, United Kingdom, the Business Continuity Institute (BCI) was established in 1994 to promote the art and science of business continuity worldwide and to assist organizations in preparing for and surviving minor and large-scale man-made and natural disasters.  The Institute enables members to obtain guidance and support from their fellow practitioners and offers professional training and certification programmes to disseminate and validate the highest standards of competence and ethics.  It has circa 8,000 members in more than 100 countries, who are active in an estimated 3,000 organizations in private, public and third sectors.

For more information go to: www.thebci.org

Friday, 25 July 2014 14:00

More Data, More Profit

Given the high profile of Big Data, mobile and data analytics, marketing should be a huge fan of data integration. “More data, more profit,” should be marketing’s motto.

However, if you happen to be working with the most backward marketing division in the world, you might want to send them this recent post, “How Data Integration Tools can Turbocharge Your Marketing.”

The post, by a freelance writer who boasts some programming experience, makes an excellent case for the value of data integration. The post casts a wide net, touting data integration’s ability to:

  • “Reduce friction” in sales cycles.
  • Develop a “customer-centric approach to data.”
  • Improve relationships with customers by maximizing demographic information.
  • Combine basic demographic data with sentiment data to help create calls-to-action.
  • Break down data silos, so you know more about life-long customers.



As compelling an IT opportunity as Big Data can be, it’s not without its challenges—not the least of which is securing all that data.

Looking to make it easier to apply encryption to distributions of Hadoop, Zettaset today announced it is making its encryption software available independently of the Orchestrator platform it created to manage Big Data security.

Zettaset CEO Jim Vogt says the Big Data encryption tools that Zettaset developed are compatible with Apache-based Hadoop distributions available from Hortonworks, Cloudera, MapR, Pivotal, Teradata, and IBM as well as Cassandra and Couchbase NoSQL databases. The tools can also now be accessed via third-party management consoles. That’s critical, says Vogt, because it allows IT organizations to apply encryption to Big Data at rest using the management tools in which they have already invested.



Last winter heavy rain, storm force winds and large waves combined with high spring tides presented England with unprecedented flooding from the sea, rivers, groundwater and surface water.

Thousands of properties were flooded, infrastructure was damaged and tragically, eight people lost their lives. The full impact of these events has not yet been calculated but we do know that 175,000 businesses in England are at risk of flooding.  

In 2012, flooding cost affected businesses an average of £60,000, so it is not surprising that flooding is a national priority. In fact, the National Risk Register of Civil Emergencies cites coastal flooding as the second highest priority risk after pandemic flu and ahead of catastrophic terrorist attack (taking both likelihood and impact into account).



No news is good news, or so the saying goes. But when equipment malfunctions and services are interrupted, no news can mean intense frustration for customers and end-users. In today’s quality- and satisfaction-oriented business world, you might think that major corporations would have grasped the importance of good crisis communication. And to be fair, many now make efforts to keep customers informed of the causes of a business interruption, the solutions being put in place, and the estimated time when normal service will be resumed. That’s what makes the behaviour around a recent outage at one of the top IT and cloud service vendors so hard to fathom.



Ted Julian describes five steps that will help ensure that your incident response plans work when they are required.

Even in the most carefully thought out incident response (IR) plans, there is room for continual improvement. Anyone who has put a response plan into action knows there is a gulf between the theoretical plan and what actually happens, given all the variables and complexities that inevitably occur. Because of this, plans often break down, particularly if they haven't been stress-tested against different real-world scenarios.

Whilst not everything will go according to schedule, a thoroughly tested and validated plan will minimise the impact of an incident which, in turn, leads to faster business recovery times. Indeed, no plan is complete until it has been tested with fire drills and functional exercises that assess its effectiveness and identify potential gaps.

Here we outline some practical steps to improving your incident response plan:



In situations where the fastest possible access to data is required – trading floors, for example – CIOs have traditionally turned to flash-based storage systems. No one disputes the performance advantages of flash over traditional disk or tape storage methods, but cost has always been a barrier to wider adoption. Today, however, the flash technology that once made sense only when extreme high performance was required is now priced to attract the attention of CIOs from a wide range of mid-sized to large companies. To help CIOs determine if flash might be the right solution for their companies, Logicalis US has outlined six key reasons flash storage makes sense for fast access to mission-critical data in mainstream applications:

1. Boosting performance: purpose-built flash storage systems can deliver performance boosts in application response times, accelerated access to information, and increased power efficiency when compared to conventional spinning disks. And, because flash storage is powerful enough to support an organization’s most demanding virtualized cloud environments, along with online transaction processing (OLTP), client virtualization, and business analytic applications, it is garnering attention from performance-hungry CIOs looking for new ways to speed access to business information.



Thursday, 24 July 2014 16:07

Global terrorism fatalities up 30 percent

Over the last 12 months, global fatalities from acts of terrorism have risen 30% compared to the previous five-year average, according to a new security monitoring service from global risk analytics company Maplecroft, which also identifies China, Egypt, Kenya and Libya as seeing the most significant increases in the risk of terrorist attacks.

The Maplecroft Terrorism and Security Dashboard (MTSD) recorded 18,668 fatalities in the 12 months prior to 1 July, up 29.3% from an annual average of 14,433 for the previous five years. Over the same period the MTSD recorded 9,471 attacks, at an average of 26 a day, down from a five-year average of 10,468, revealing that terrorist methods have become increasingly deadly over the last year.
The MTSD classifies 12 countries as ‘extreme risk,’ many of which are blighted by high levels of instability and weak governance. These include: Iraq (most at risk), Afghanistan (2nd), Pakistan (3rd), Somalia (4th), Yemen (6th), Syria (7th), Lebanon (9th) and Libya (10th). However, of particular concern for investors, the important growth economies of Nigeria (5th), the Philippines (8th), Colombia (11th) and Kenya (12th) also feature in the category.



Communication in the workplace is challenging enough under the best of circumstances, but in workplaces that can have as many as four generations struggling to communicate with each other, even simple exchanges can result not only in miscommunications, but in misunderstandings that can create serious problems.

One person who has given this problem a lot of thought is Dana Brownlee, a corporate trainer and management consultant whose background in technology includes stints at AT&T Bell Labs, IBM Global Services and EMC. In a recent interview on the topic of multigenerational communication issues in the workplace, I asked Brownlee if, in light of her technology background, she had any sense of whether these issues are more or less prevalent in an IT organization, compared to other organizations.

“My experience has been that IT is such a rapidly developing field, that there's a Darwinian effect that forces anyone who's successful in the field to change, learn, and adapt, early and often,” she said. “As a result, I've tended to see less of these generational communication issues in IT. I'm sure there are exceptions, but that's my general observation.”



Wednesday, 23 July 2014 16:20

The Resilience Upsurge

The overwhelming response to our range of programmes at Buckinghamshire New University has been indicative of the interest in our focus on resilience, and in our emphasis on the ‘New’ in our name. Resilience is not new – organisations have been good (or bad) at it for years. The upsurge in interest in Organisational Resilience is about the need to be able to understand, blend and apply the constituent elements – risk, impact, security, crisis, emergency, disaster, business continuity, change and personnel management are a few of them. With many specialists around who cover some of these areas but few, understandably, who cover all, our aim is to provide a resilience perspective to every programme that we run.

For the MSc Organisational Resilience that is a given. However, in our programmes on Cyber Security, Business Continuity and Security Consultancy, that same approach is applied. By looking outside the specialism, but by retaining that specialist focus, the effective resilience super-practitioner/manager/professional/director is able to contextualise their own actions, plans and ideas and to build and develop an interlocking and intertwined capability. Finally, we are beginning to see the need expressed by both specialists and non-specialists for such a capability to be developed. However, this is not an anodyne function that is grey and bland; it is a multi-faceted and interlinked organisational enhancement that offers significant challenges; it needs confident, capable and educated leaders.



The statement that investments in resilience pay huge dividends when disaster strikes rings true, but the conversation can’t end there. 

As a longtime local and state emergency management director, one of my final challenges remains unmet: the ability to gather the combined resources of a community to consider the challenges of restoration prior to a disaster.

Here’s why: The risks are often known, but that knowledge is diffused among a number of agencies. Those who know the most about risk rarely have an opportunity or a forum, outside of their own professional discipline, to educate or share their knowledge with others. We need discussions outside of our respective disciplines because no one group or profession possesses either all of the answers or a clear understanding of all of the negative impacts that could arise from a disaster.



“Sexual assault is always avoidable.” Far short of the 140 characters allowed by Twitter, but enough to cause an immediate “twit storm.” The unfortunate tweet -- generated by a consultant hired by Massachusetts to handle its Twitter communiqués -- was meant to cap off the state’s recognition of Sexual Assault Awareness Month. If awareness was the tweet’s goal, it achieved it in spades. The tweet immediately set off a firestorm of controversy.

Joe Fitzgibbon stumbled into a similar twit storm. The Washington state representative tossed off a flippant -- but arguably amusing -- tweet after the Seattle Seahawks lost to the Arizona Cardinals in a football game last fall. “Losing a football game sucks,” Fitzgibbon wrote. “Losing to a desert racist wasteland sucks a lot.” The reference to Arizona’s arid climate and less-than-liberal immigration laws set off an interstate uproar, testimony to the power of a handful of words moving through the ether.

Words aren’t the only way Twitter can do damage. The New York City Police Department in April created a hashtag -- #myNYPD -- allowing citizens to quickly and easily post pictures to the department’s Twitter page of NYPD’s finest in action. The public largely responded by tweeting the department’s less-than-finest moments: a veritable gallery of the city’s men and women in blue clubbing, tear gassing, handcuffing and tackling Gotham citizens. “It was unfortunate to see what happened to the NYPD,” says Anil Chawla, author of an online white paper Twit Happens: How to Deal with Tweet Regret in the Public Sector. “It probably gives other government agencies pause.”



BSI has published a new white paper which explains why the benefits of BCM ‘go far beyond helping organizations recover from unexpected disruptions’.

The executive summary reads as follows:

  • BCM is a critical business discipline, helping organizations prepare for, and recover from, a wide range of unexpected incidents and unwelcome interruptions.
  • The importance of recovery – the most obvious purpose of BCM – can hardly be overstated. Thousands of businesses have saved time and money by getting back up and running quickly after a disruption – and some even owe their survival to it.
  • Despite the ‘recovery rationale’ for BCM, many business leaders are yet to embrace the discipline – while others are implementing it piecemeal or poorly.
  • Organizations are missing out on more – much more – than simply a speedy return to ‘business as usual’ in the event of disruption. BCM can provide a rich return on investment (ROI) without the occurrence of a disaster.
  • A robust BCM process offers many advantages, from lower insurance premiums and process improvements to business expansion and brand enhancement.
  • At a strategic level, BCM can play a key part in organizations’ risk management processes, answering to the demands of today’s onerous regulatory and corporate governance requirements.
  • It is time for the ‘C-suite’ to wake up to the full range of BCM benefits and the true ROI the discipline offers.
  • Help is at hand: the management system standard ISO 22301 provides the ideal framework for implementing a BCM system.
  • Many organizations, both large and small, have already implemented ISO 22301, harnessing a host of benefits from this multi-faceted standard.
  • Some have maximized the benefits by achieving independent third party certification to ISO 22301, enabling them to demonstrate ‘badge on the wall’ best practice in this vital area.
  • There is a growing trend for companies to be required to hold certification to ISO 22301 by powerful private and public sector customers – or risk losing business.

Read the white paper (PDF).

Business continuity problems often carry their own penalty in the form of lost revenue, customer churn and reputational damage. In some cases, outages also mean stiff fines that go beyond the penalties that are part of any service level agreement. Thus, SingTel, the Singaporean telecommunications company, received a S$6 million fine (about US$4.81 million) from Singapore’s ICT regulator for a breakdown in service in October 2013. The disruption affected government agencies and financial institutions and had an impact on 270,000 subscribers. But what is really behind fining a company whose business continuity fails like this?



Among the many unwitting assumptions that occur when developing Business Continuity plans is the assignment of roles to specific individuals.  A smart BCM planner will at least buttress that assignment with a backup person – just in case.  But is that really enough?

In many cases it should be.  But there are many others in which roles assigned to individuals (even with a backup) may prove wholly inadequate.

The most obvious is in a natural disaster scenario.  What if the individuals cannot be contacted or the roads are closed?  Those may cause a temporary problem.  On the other hand, consider that the individuals may have other priorities – like protecting their family, or their home.  Those priorities may not mean they can’t respond to their BCM obligations – but they may be unwilling.  If you’ve assigned tasks to an individual who doesn’t show up, you’ll have to scramble to reassign the task (perhaps to someone unfamiliar with the role and responsibilities).  And even in a non-disaster situation, a named individual may be on holiday or away on business.  They can’t help if they’re not there.



If circumventing the IT department is “kind of a given,” as one executive from a cloud services provider put it in a post I wrote earlier this month, it may be just as much of a given that what business units are most eager to acquire when they do circumvent the IT department are business intelligence and data analytics tools.

I recently discussed this phenomenon in an email interview with Fred Shilmover, CEO of InsightSquared, a provider of cloud-based BI services in Cambridge, Mass. When I asked Shilmover if he’s finding that business units are circumventing the IT department in order to get the data analytics tools they need, Shilmover said they see it all the time among companies that have purchased a lot of cloud-hosted software, and he noted that the tools are often purchased by a new line-of-business leader:



Residents of Alaska have historically been more likely than people in other states to have a supply of frozen food on hand, but their reliance on food from stores has grown in recent years, leaving them vulnerable in an emergency.

Like every other state, Alaska has to be prepared for disasters, both natural and man-made. But as it works to make sure its residents would have enough food in a disaster, the state also has to deal with some unique challenges.

“We’ve got volcanoes, earthquakes, cold weather — a lot of potential for emergencies up here,” said Danny Consenstein, state executive director for the U.S. Department of Agriculture Farm Service Agency in Alaska. “Do we have a food system that is resilient and strong, that could help us in case of emergencies?”



Tuesday, 22 July 2014 15:22

AIR releases Canada earthquake model

Catastrophe risk modelling firm AIR Worldwide has updated its earthquake model for Canada. The comprehensive update provides insurers and other industry stakeholders with an advanced tool for assessing potential losses from ground shaking, fire following earthquake, tsunami, liquefaction, and landslide in the Canadian market, and will support compliance with OSFI Guideline B-9.

"The updated Earthquake Model for Canada has been extensively reengineered and offers significant enhancements," said Dr Jayanta Guin, executive vice-president, research and modelling, AIR Worldwide. "The model reflects an up-to-date view of seismicity based on the latest hazard information from the Geological Survey of Canada and collaboration with leading academics. In addition to the ability to estimate losses from shake, fire following, and liquefaction, the release is the first in the industry to include fully probabilistic landslide and tsunami models for Canada. Virtually every component of the updated model has undergone peer review."

Testament to the sophistication of the model, the Insurance Bureau of Canada selected AIR Worldwide to conduct the most comprehensive study of seismic risk in Canada ever undertaken. According to IBC, AIR's study will help drive a national discourse on mitigation, financial preparedness, and emergency response.



The final attribute of the RIMS Risk Maturity Model should be of great interest to risk managers responsible for establishing an enterprise risk management (ERM) program. Without some level of business resilience and sustainability built into your program, the iterative, cultural changes that are created by the ERM process will wane and your exposure to loss events will increase.

Understanding Consequences

Traditionally, business continuity plans have focused on technology platforms, but resiliency means much more than ensuring that your information technology infrastructure is prepared for disaster recovery. Consider that the IT infrastructure that is the focus of your business continuity plans is likely to play a critical role in the execution of your mitigation activities (for example, a server that supports access rights and security). A lack of capability to explicitly identify relationships between these entities can result in huge increases in short term risk exposure at the worst possible time, as rapidly deteriorating business environments require even stronger change management ability.



If your wife is a researcher in medical entomology, you’ll often hear odd tidbits related to mosquito-borne diseases. For instance, did you know how cute malaria parasites can look under a microscope? I didn’t either, until I met Cassandra Urquhart. (Some other things I’ve heard described as “cute” since then include, but are not limited to: cockroaches, nematodes, spiders, earwigs, and male mosquitoes.) She’s fascinated by her own work with La Crosse virus, excited by new papers on dengue fever, and interested in how many of the mosquitoes she’s collected at sites around Knox County, Tennessee will test positive for West Nile virus. In her spare time, she reads books on the history of yellow fever and Chagas disease for fun. Don’t get me wrong—she cares about the human toll of such diseases. But as a scientist, she’s usually more curious than alarmed about them. However, when it comes to chikungunya virus, my cheerfully bug-obsessed wife gets far more serious—and so do many entomologists. So why is chikungunya different?

Chikungunya virus seems, at first, to have a lot in common with dengue virus, another mosquito-borne pathogen. Both cause extremely painful diseases—chikungunya’s name comes from a Makonde word meaning “that which bends up,” referring to the contortions sufferers put themselves through due to intense joint pain. Dengue’s nickname is breakbone fever. Both viruses are primarily transmitted by the Aedes aegypti mosquito, and both have been moving slowly closer to the United States over the past decades, with local cases of dengue fever already found in Florida and Texas.

Last week the Centers for Disease Control and Prevention announced the first locally acquired cases of chikungunya in the United States. A woman in Miami-Dade County and a man in Palm Beach County, neither of whom had left the country recently, both came down with the dreaded disease.



For the past few years, one of the BSI committees has been working to develop a guidance standard that organisations can use to better direct, inform and support their work and positively impact their resilience.
The standard, known as “BS 65000:2014 Guidance on organizational resilience”, has challenged the author group and been through extensive revisions before finally reaching the public comment stage.
The Draft for Public Comment (DPC) is open and your feedback is invited (closes on 31st July 2014).
This process is part of the formal development of any new standard and allows wider review and consideration of the standard across industry and other stakeholders, providing a final chance to update or address issues ahead of final publication.
Initial review of the comments will be undertaken in August, with the resolution process being completed in September. Although dependent on the feedback and subsequent approval of the standard, publication is expected towards the end of 2014.
Members of the Continuity Forum Organizational Resilience Working Group can make submissions directly to us or choose to use the BSI Public Review system that can be found at:     

There is a view that senior members of organisations are the strategists, the shapers of the future and the people responsible for the developments in industry and the professions that set the direction of various sectors.  They may well be; but in continuity, security, crisis and emergency management – they probably aren’t.  The conflation of seniority in an organisation with the assumption of associated strategic capability is common, but where hierarchies are populated at the top end by those who have got there by a combination of luck, ruthlessness, ‘dead man’s shoes’, or any other combination of assumed capability, the reality is different.

In the world of ‘resilience’ (and for today I am combining those specialisms mentioned above – amongst others – under that term), strategies should be driven – but often aren’t – not by senior managers and directors, but by those who are able to think, consider and plan for the future.  Resilience is necessarily reactive to what has happened previously and in essence is about trying to reduce the impacts of future recurrence.  And to consider the wisdom of strategists we can look at any number of examples from recent years and think about why those with authority and power, kudos and seniority can’t strategise their way out of a paper bag.



No-one likes to feel that they have no control. It’s demoralising to think that someone or something else directs and influences your destiny, your future and everything involved in it.  Of course, you can help yourself by learning to be assertive or adopting a particular approach to life that allows you to regain some of the control that we all risk losing in life today. However, there are many variables that influence the way our lives turn out; and to me it makes sense to reduce those variables as much as we are able.

It does seem strange that so many of us leave it quite late in life to understand that we need to take control and determine our future; that there are some aspects of life that we can affect on our own initiative and with determination.  I meet a lot of people who, for example, have ‘never had the time’ to study the subject that their job involves, or who say ‘I’ve realised that I need a qualification’, when both knowledge about their business and evidence of that knowledge are key elements of development and progression.  Without them, there are gaps in capability that you cannot fill – and they are therefore filled by someone or something else – and if that happens to you, you do not have control over either your, or your organisation’s, future.



MONTGOMERY, Ala. – The backbreaking work accomplished by volunteers in Alabama  following the April 28 through May 5 severe storms, tornadoes, straight-line winds and flooding seems to have occurred out of the clear blue sky.

  • More than 25 Amish men traveled 70 miles to help a Madison County farmer clean up debris and help fix her home. They asked for nothing in return except a hot meal.
  • Nearly 100 volunteers showed up over a recent weekend to cut and remove 25,000 cubic yards of debris in Bessemer. But that’s just a drop in the bucket – one month after the disaster, volunteers had removed nearly 80,000 cubic yards of debris. All these volunteers wanted was a “thank you.”
  • In Coxey, Samaritan’s Purse, a Christian service and relief organization, brought in 471 volunteers who put in 5,900 hours in just three weeks. Also there, a local church was transformed into a storm relief center and overflowed with donations of clothes, food, personal hygiene items, cleaning supplies, and pet and baby items for survivors. The look on survivors’ faces was ample payment for these workers.

Every year and in every disaster, volunteers fill an often overlooked role, seemingly arriving and leaving the scene at just the right time. A closer look reveals a network of agencies choreographing volunteer groups with seamless precision to fill the gaps that the federal government cannot. They are called Long Term Recovery Committees, or LTRCs.

Charles “Larry” Buckner serves as a Federal Emergency Management Agency volunteer agency liaison in Alabama to help coordinate these efforts and provide advice. He also reviews benefit requests to make sure there is no duplication.

“As far as we know, there is $4.2 million in unmet needs in home repair in all nine designated counties in this disaster,” Buckner said. “Of these counties, seven have set up Long Term Recovery Committees, some of which had just barely shut down because of the tornadoes from 2011.”

The two remaining counties have not had LTRCs in the past but are now forming them.

While FEMA and the state can and have helped survivors, neither the federal nor state governments are empowered by law to make disaster survivors whole, that is, to fully replace all that is lost.

LTRCs pick up where FEMA leaves off. Their goal is to identify and meet as many reasonable needs as possible.

These committees are the boots on the ground determining what unmet needs exist. They, in turn, work with state Voluntary Organizations Active in Disasters and other groups to attain what is needed, whether it is cash, workers or donated materials.  

The committees are everywhere across the country, Buckner said. The concept has been in existence for more than 18 years.

These committees are made up from a variety of organizations – church denominations, local charities, community foundations and some independent groups, such as nondenominational “mega churches.” The one feature they all share is a calling to help serve those in need.

“United Way is providing case workers in some counties and may act as the fiduciary, the American Red Cross may provide case workers as does the Salvation Army,” he added.

In Alabama, Buckner said, the LTRCs are working with Serve Alabama, part of the governor’s office, and have applied for a grant to be used to hire case workers.

“With the grant, they can hire 12 case workers for 18 months,” he said. “It asks for just shy of $1 million.” If approved, the grant will come from FEMA, he added.

The case workers meet with survivors and assess their unmet needs. They take into account what FEMA provided, but FEMA grants are capped at $32,400 per household. Anything beyond that amount is where the LTRC committees can assist.

The case worker will make a recommendation to a group of three to five committee members “in such a way that the board sees the facts but may never know who that individual is,” he explained.

“That is done to prevent favoritism or someone being passed over based on who the survivor is,” he said. “Then, the group gives a thumbs up or down to entirely or partially meet the unmet need. You won’t see them replacing a swimming pool, but they may replace house siding and decide to paint it as well.”

While this is going on, other members of the LTRC are working to recruit volunteer organizations such as Habitat for Humanity, the Mennonites and others to come in and repair or rebuild homes. Still others are securing grants large enough to meet most, if not all, of the unmet needs.

“The dollars can go into the millions,” he said.

And any excess funding all goes to meet the needs of the survivors.

“If there is a surplus, they use the money to replace furniture, appliances and other things that will help people get back on their feet.

 “They want to provide people with safe, sanitary and functional homes,” Buckner said. “In some areas of the country they are not as successful. But they are here because the southern culture dictates that communities take care of their own.”

While no state can rule out the possibility of experiencing an earthquake, 42 states have a “reasonable chance” of having damaging ground shaking from an earthquake, according to recently updated information from the U.S. Geological Survey (USGS). The agency’s research also determined that 16 states — those that have experienced a magnitude 6.0 earthquake or larger — have a “relatively high likelihood” of having a damaging quake in the future.

The updated U.S. National Seismic Hazard Maps were released July 17 to reflect current understanding of where future earthquakes will occur. The data reflected what researchers have known: The earthquake hazard is highest on the West Coast, intermountain West and in regions of the Central and Eastern U.S., including near New Madrid, Mo., and Charleston, S.C. “While these overarching conclusions of the national-level hazard are similar to those of the previous maps released in 2008, details and estimates differ for many cities and states,” reported the USGS. “Several areas have been identified as being capable of having the potential for larger and more powerful earthquakes than previously thought due to more data and updated earthquake models.”

“What we’re doing is trying to forecast future shaking based on past behavior,” said Chuck Mueller, a research geophysicist with the USGS.



Congratulations! We’d Like You to Implement a Business Continuity Program in our Organization

Picking up the pieces and starting a business continuity program takes finding a BC mentor to a whole new level

We last spoke about finding one or several subject matter experts to help you understand a bit more about business continuity, disaster recovery and crisis management in your organization. Your inquisitiveness and understanding in these areas have brought you to the attention of management and perhaps position you to be the best candidate to continue or restart BCP efforts. You’ve become a business continuity planner!

While simple interest doesn’t necessarily get you promoted, maybe your experience in information technology, business operations, audit/EDP audit or facility management qualifies you to earn the confidence needed to support your BCP efforts.



As Business Continuity continues its growth as a profession, the idea of certification and the membership of professional bodies are more frequently discussed at all levels of the organization – from those starting out their career in the industry, right up to the Board Room.

As an individual you will be looking at the long term development of your career while those at Board level need to consider the long term growth of the organization. Of course the two of these are not mutually exclusive and many managers will tell you that the best way to grow an organization is to invest in its people.

The first step on the professional ladder is certification. Certification gives you an outward facing verification of your knowledge in that discipline. Attaining this level of qualification will set you apart from those who are not certified, who would only have knowledge of BC in their current environment.



A new report from New York State’s Attorney General details the damage to the state’s citizens and organizations from reported data breaches over the last eight years. “Information Exposed: Historical Examination of Data Breaches in New York State” attempts to illustrate the exponential growth in breaches, reports of breaches and some of the related costs, and then gives recommendations on how individuals and companies can better protect themselves.

In brief:

  • Almost 5,000 separate data breaches were reported to the AG’s office between 2006 and 2013.
  • These breaches exposed 22.8 million personal records of New Yorkers.
  • The number of breaches reported annually more than tripled during the time period.
  • 2013 was a record-setting year, with 7.3 million records of New Yorkers exposed.
  • Five of the 10 largest breaches reported to the AG have occurred since 2011. These are considered “mega breaches.”



An interesting article in Fortune this morning covered a round table of security and technology experts who discussed the biggest threats to businesses. Stephen Gillett, Symantec’s chief operating officer, said there were three types of threats: script kiddies, organized crime and state-sponsored. In my opinion, he forgot a few, like hacktivism, which I think he includes with script kiddies, though hacktivism needs to stand on its own as one of the most serious threats to business operations.

The panel also raised what I think is a very important question: Do you know your company’s weakest security link? Yes, they talked about insider threats and how they are underestimated in relation to outsider threats:

It’s more likely that an employee doesn’t realize the value of the data access they have, even if they’re a low-profile employee.



Explaining just why cyber attacks and data breaches are a very real concern for business continuity professionals, a report published by ForeScout Technologies revealed that 96% of respondents who took part in their survey had experienced a major IT security incident in the last year. 39% experienced at least two incidents while 16% experienced at least five.

The IDG Connect Cyber Defense Maturity Report 2014 was the result of a study of 1,600 IT security decision makers who work for companies with more than 500 employees located in three distinct regions - the US, UK and DACH (Germany, Austria and Switzerland). Respondents worked in the finance, manufacturing, healthcare, retail and education sectors.

The majority of those surveyed were aware that some of their security measures were immature or ineffective, but only 33% were very confident that they could improve their less sophisticated security controls. The report suggested that growing operational complexity and threats have reduced security capacity, with over 43% claiming that the prevention, identification, diagnosis and resolution of problems is more difficult today than it was two years ago.
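To put the survey percentages above in concrete terms, here is a back-of-envelope sketch of the headcounts they imply across the 1,600 respondents. These are illustrative calculations only, not figures taken from the report itself:

```python
# Rough headcounts implied by the IDG Connect survey percentages
# quoted above (1,600 respondents). Illustrative arithmetic only.
respondents = 1600

at_least_one  = round(0.96 * respondents)  # had a major incident in the last year
at_least_two  = round(0.39 * respondents)  # had at least two incidents
at_least_five = round(0.16 * respondents)  # had at least five incidents

print(at_least_one, at_least_two, at_least_five)  # 1536 624 256
```

In other words, on these numbers only around 64 of the 1,600 organizations surveyed escaped a major IT security incident entirely.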

With the threat so high, as also demonstrated in the latest BCI Horizon Scan Report, organizations must ensure they have plans in place to deal with the consequence of IT security incidents should they occur. Organizations are becoming more and more reliant on technology and IT, but even if those systems malfunction, with an effective business continuity plan in place, the organization should still be able to function.

38% of executives claim that supply chain management is their main challenge over the coming year with 42% placing it at the top of their list for increased investment. Those were some of the findings of a study carried out by the Consumer Goods Forum and KPMG International. The figures were even higher for those in the retail sector with over half (51%) of non-food retailers citing supply chain management as their main challenge.

The annual Global Top of Mind survey, a poll of nearly 500 C-suite and senior executives across 32 countries, also revealed how important the digital revolution will be over the next 12 months to consumer goods and retail companies – impacting everything from business growth and supply chain management to food safety, sustainability, and data security and privacy.

Supply chains are becoming longer and more complex with many factors coming into play such as infrastructure and weather - a lot of data needs to be processed in order to make sure they are fully optimised. As the complexity increases however, so does the possibility of disruption.

It is easy to see why supply chain management is an issue when you look at the most recent BCI Supply Chain Resilience Report. This report highlighted that 75% of respondents did not have full visibility over their supply chain and that 75% experienced at least one supply chain disruption over the last year with 42% of these disruptions occurring below the tier one supplier. 15% of respondents experienced disruptions that cost in excess of €1 million and 9% experienced a single disruption that cost in excess of €1 million.

The study concludes that as supply chains become increasingly complex, greater collaboration among suppliers and retailers is needed. Companies need to achieve greater visibility beyond their tier one and two suppliers and that downstream supply chains also need to be more transparent and agile.

The 2014 BCI Supply Chain Resilience Survey is currently live and can be completed by clicking here.

Readers of this blog know I am a huge Civil War buff. Growing up in Texas, I focused only on the Southern side as a youngster, and while this led to a sometimes myopic view of events, when I began to study the Northern side of the war in my mid-20s an entire panorama opened up for me, because I had never seriously studied events from that perspective.

One thing that never changed, however, was the disaster that befell the South with the appointment of John Bell Hood as commander of the Army of Tennessee, which had opposed General Sherman’s advance into Georgia since his stunning defeat of the Confederate forces at Chattanooga and later Lookout Mountain in Tennessee in late 1863. On this day 150 years ago, Confederate President Jefferson Davis replaced General Joseph Johnston with John Bell Hood as commander of the Army of Tennessee. Davis, impatient with Johnston’s defensive strategy in the Atlanta campaign, felt that Hood stood a better chance of saving Atlanta from the forces of Union General William T. Sherman. Davis selected Hood for his reputation as a fighting general, in contrast to Johnston’s cautious nature. Hood did what Davis wanted and quickly attacked Sherman at Peachtree Creek on July 20, with disastrous results. Hood attacked twice more, losing both battles and destroying his army’s offensive capabilities. Over the following weeks in 1864, Hood’s actions not only helped secure President Abraham Lincoln’s reelection but spelled, once and for all, the doom of the Confederacy.

I thought about the risks of appointing Hood to command when I read a recent article in Compliance Week by Carol Switzer, co-founder and President of the Open Compliance and Ethics Group (OCEG), entitled “A Strategic Approach to Conduct Risk”. Her article was accompanied by an entry in the OCEG Illustrated Series, entitled “Managing Conduct Risk in the GRC Context”, and she also presented thoughts from a roundtable which included John Brown, Managing Principal, Risk Segment, Financial and Risk Division at Thomson Reuters; Tom Harper, Executive Vice President-General Auditor, Federal Home Loan Bank of Chicago; and Dr. Roger Miles, Behavioral Risk Lead, Thomson Reuters.



Historically, corporate Boards of Directors have held the responsibility of risk management oversight, ensuring that risk management processes are clearly defined and appropriately enacted. Their role in managing risk has been to provide guidance and leadership on matters that impact the strategic direction of a company or its public image.  In this traditional view, C-level management is left with the responsibility of actual risk assessment and mitigation, including issue resolution. But in today’s fast-paced and social-media driven world, the speed at which a risk can turn into a widely publicized issue means Board members must now provide both tactical and strategic supervision over risk management as part of their membership.

In the wake of recent financial crises, increased awareness and interest from a broader array of company stakeholders now exists. High-profile and highly reported product quality problems continue to impact multiple industries, and both regulators and Boards have been forced to re-evaluate the structure and the role of their risk governance efforts. Whether required by law or not, many corporate Boards, especially (but not solely) those in the financial industry, have taken a more active role in managing corporate risks. Regardless of regulation or stakeholder demands, an active risk management initiative at the Board level makes good business sense, because each risk, whether strategic, operational, political, reputational or other, presents companies with an opportunity to build competitive advantage. The proliferation of risks in the current environment has intensified and forced companies to focus on impacts that must be avoided and opportunities that should be seized. From our point of view, the Board of today should play a direct role in the new risk environment paradigm by creating an active Board-level risk management program. Such an approach will allow organizations to transition from defending against risk to a more proactive approach that leverages risks as new opportunities, and perhaps even advances organizations toward more “blue ocean” possibilities.



Factonomy’s Robin Craib gives his view on why business continuity management tools need to be built around a genuine relational database.

Across the business continuity management marketplace we see a variety of competing solutions that stake out various concepts from across the BCM landscape. Many of these tools contribute to the progression of the industry by developing concepts from best practice and helping to reduce the administrative burden.

Most business continuity management tools use a genuine relational database (RDBMS) and, whilst all companies will be eager to compete on the specifics of their features, all aspire to provide extensive reporting features that unlock the carefully collected data for the business continuity management system the solution is being used to manage. In many cases, these solutions represent a process of application development that has involved significant investment in time, money and expertise, whether as newly released solutions on the market or solutions that have iterated over time in response to market feedback. It’s fair to characterize most solutions as looking to capture and maintain real BCM data; competitors can argue over the extent to which this occurs, but most are moving towards this approach, with the solution representing the data warehouse for BCM inside the organization.
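The reporting advantage of a genuine relational store can be sketched with a minimal example. The schema below is purely hypothetical (no real BCM product is being modelled): processes with recovery time objectives, applications, and a dependency table linking them. A join-based query then answers a cross-cutting question that a flat document repository cannot easily answer:

```python
import sqlite3

# In-memory database standing in for a BCM tool's relational store.
# All table and column names are illustrative, not from any real product.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE process (id INTEGER PRIMARY KEY, name TEXT, rto_hours INTEGER);
CREATE TABLE application (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dependency (process_id INTEGER REFERENCES process(id),
                         app_id     INTEGER REFERENCES application(id));
""")
db.executemany("INSERT INTO process VALUES (?, ?, ?)",
               [(1, "Payroll", 24), (2, "Order entry", 4)])
db.executemany("INSERT INTO application VALUES (?, ?)",
               [(1, "ERP"), (2, "CRM")])
db.executemany("INSERT INTO dependency VALUES (?, ?)",
               [(1, 1), (2, 1), (2, 2)])

# A cross-cutting report: for each application, the tightest recovery
# time objective of any business process that depends on it.
rows = db.execute("""
    SELECT a.name, MIN(p.rto_hours)
    FROM application a
    JOIN dependency d ON d.app_id = a.id
    JOIN process p    ON p.id = d.process_id
    GROUP BY a.name
    ORDER BY a.name
""").fetchall()
print(rows)  # [('CRM', 4), ('ERP', 4)]
```

Because the data lives in normalized tables rather than in documents, a new reporting question is just a new query; in a re-badged content management system the same question typically means re-reading every plan document.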

There is, however, a minority of business continuity management tools that have sprung up in recent years and circumvented this process of application development, along with the related investments in time, money and expertise. These solutions have piggy-backed on existing content management systems (CMS) or document management solutions in the market. Typically the approach is to re-badge the tool as a business continuity management tool and to quickly modify existing menus, options and interfaces to align with aspects of the BCM lifecycle.



FEMA has announced that the ISO 22301 business continuity standard has been accepted as a PS-Prep standard and two of the previously adopted PS-Prep standards have been removed, after being retired by the standards development organizations that originally developed them.

The retired standards are:

  • National Fire Protection Association (NFPA) 1600: Disaster/Emergency Management and Business Continuity Programs, 2007 and 2010 editions.
  • British Standards Institution (BSI) BS 25999-2:2007 Business continuity management Part 2: Specification.

The situation now is that PS-Prep recognizes three business continuity standards.

PS-Prep program information and references will be updated to reflect these changes.

More details on PS-Prep.

Sources: ICOR and FEMA

So, I was recently helping a colleague prepare a management presentation to discuss her plans for advancing the business continuity program in her company.  Maybe it’s just a matter of semantics, but we had a lengthy discussion over “objectives”, “goals” and “tasks”.

If you have read any of my recent blogs you might recognize a pattern in which I think business continuity planners have become victims of our own methodology.  This discussion helped me to emphasize that point.  When I suggested to my colleague that she should first succinctly define her objective, she merely listed the steps of the methodology.  I strongly disagree.

A business continuity planner’s objective is not to complete the BCP methodology.  The methodology is simply a recipe towards achieving an end.  What is that “end” you hope to achieve?  That “end” is your ultimate objective.



(MCT) — Efforts to put in place an earthquake warning system for the West Coast gained ground Tuesday as a congressional committee recommended the first federal funds — $5 million — specifically for the project.

Its prospects remain shaky, however.

Election-year fights over other issues could keep Congress from completing work on its spending bills.

Still, the warning system enjoys bipartisan support.

"It's critical that the West Coast implement an earthquake early-warning system that will give us a heads up before the 'big one' hits, so we can save lives and protect infrastructure," said Rep. Adam Schiff (D-Burbank), who led a group of a West Coast lawmakers in seeking the funding.



Wednesday, 16 July 2014 13:44

Typhoon Rammasun heads for Manila

According to catastrophe modelling firm AIR Worldwide, with sustained winds of 157 km/h (~98 mph), Typhoon Rammasun made landfall in the Philippine province of Sorsogon July 15, late afternoon local time, and headed toward Manila. Although smaller and much less intense than deadly and highly destructive Super Typhoon Haiyan – which devastated parts of the Philippines in November 2013 – Typhoon Rammasun nonetheless prompted sizable evacuations and resulted in some disruption of transportation, as well as school and office closings. Widespread damage is not expected, but some areas could experience storm surge flooding, flash flooding, and/or mud slides, as well as wind damage.

“Rammasun rapidly intensified in the 12-hour period prior to landfall, with its central pressure decreasing from 975 to 945 mb and maximum sustained wind speeds increasing from 120 to 157 km/h (~75 to ~98 mph), according to JMA intensity estimates,” said Dr Kevin Hill, senior scientist at AIR Worldwide. “At landfall, Rammasun featured the well-defined eye and symmetric eyewall, indicative of a strong typhoon. Rammasun is not a significant threat to areas affected by Typhoon Haiyan (2013).”
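As a quick sanity check on the intensification figures quoted above, the arithmetic can be sketched directly. The only value introduced here is the standard km/h-to-mph conversion factor; everything else comes from the JMA estimates in the text:

```python
# JMA intensity estimates quoted above: 12 hours before landfall
# vs. at landfall.
pressure_drop_mb = 975 - 945            # central pressure fell 30 mb
wind_before_kmh, wind_landfall_kmh = 120, 157

KMH_TO_MPH = 0.621371                   # standard conversion factor

print(round(wind_landfall_kmh * KMH_TO_MPH))   # 98 mph, matching "~98 mph"
print(round(wind_before_kmh * KMH_TO_MPH))     # 75 mph, matching "~75 mph"
print(pressure_drop_mb / 12)                   # 2.5 mb per hour of deepening
```

A sustained pressure fall of 2.5 mb per hour over 12 hours is consistent with the "rapid intensification" characterization in the quote.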

In the Philippines, typhoon and flood damage are usually covered together, provided under fire policies with named-perils extensions. Insurance penetration varies by region. Typhoon Rammasun will affect some densely populated urban areas, including Manila, where insurance penetration is around 5-10% for residential lines and 25-30% for commercial/industrial. Still, with overall penetration in this area at roughly 10% to 20%, insured losses are not expected to be significant as a result of this typhoon.



“Be your own hero.” The words hit me like a bolt of lightning (I’ll explain why later). I was at my office having coffee with a colleague and discussing the changing (declining) status of the world we live in. We started talking about the rash of random shootings and how they have, ironically, improved our nation’s geographical knowledge. Very few people had ever heard of Aurora, Colo.; Isla Vista, Calif.; or Newtown, Conn., before deranged gunmen decided to take their personal grudges out on innocent people. Now the names evoke equal parts sympathy and infamy in the minds of many Americans.

The conversation then turned to how to use these situations to encourage general preparedness. For example, countless times people have gone to the movies and texted or talked their way through the “Please take time to make note of the nearest exit” message displayed before the feature presentation. After the shooting in Aurora, however, patrons were discreetly and independently scoping out the nearest exits well before being prompted to do so.

Why the shift?



While Texas' Republican leadership touts the state’s booming economic growth, Texas-based climate scientists — some of the world’s most renowned — say that growth has come at a high cost.

In coming decades, the state is projected to be several degrees warmer and see longer and more severe droughts. Regions that already receive little rainfall will probably become drier, and portions of Texas’ 367-mile Gulf Coast should see rising seas, leaving them more susceptible to storms — which could also become stronger due to warming oceans.

Yet despite these forecasts, Texas remains one of the most significant contributors to global warming in the world. Year after year, Texas spews out more greenhouse gases than any other state in the country, and much of its growth is due to an energy boom that relies on extracting more carbon-polluting fossil fuels.



Tuesday, 15 July 2014 20:41

Prep Check! Lockheed Martin

Prep Check Banner

Have you ever considered what you would do if you were out and about and severe weather struck? Where would you find shelter?  Would it be safe to try to go home? 

2008 Atlanta tornado damage

Photo courtesy Atlanta-Fulton County EMA

You can’t control where you will be during an emergency. In March 2008, a tornado struck downtown Atlanta, damaging several buildings and interrupting an SEC game at the Georgia Dome and an Atlanta Hawks game at Phillips Arena. In April 2011, a tornado hit Lambert-St. Louis International Airport, sending passengers scrambling for cover as debris swirled in the air around them. Planes, with passengers in them, were damaged, windows were knocked out, and the terminal was temporarily shut down for repairs.

Disasters can strike any place at any time and it’s important to learn how to be safe wherever you are.

CDC’s Prep Check! takes preparedness into the community.  Each episode features a venue that many people visit regularly – large businesses, airports, churches, and more – to learn about how each of them prepares for a disaster. At the venue, we talk to emergency planners about their preparedness activities to protect employees and visitors, and experiences they have had with emergency situations.

In our first episode, we visit Lockheed Martin, one of the world’s largest defense contractors, with more than 100,000 employees. Lockheed Martin is a global security and aerospace company engaged in the research, design, development, manufacture, and sustainment of advanced technology systems and products. In this episode we talk to three of their emergency planners about their preparedness activities. Go behind the scenes with Prep Check! and you’ll probably discover that you know more about this company than you realize.

Stay tuned to see how other locations prepare for emergencies and learn what you can do to keep yourself and your family safe.

Overcapacity in the international construction and property/casualty markets in the first half of 2014 has resulted in rate reductions of up to 30% for commercial insurance buyers, according to Willis.

This is primarily driven by benign loss activity and softening conditions in the global reinsurance market, which is having a trickle-down effect on the primary insurance market, according to Willis’s Q3 2014 Construction, Property & Casualty Market Review. Over and above rate reductions, corporate insurance buyers are also benefitting from an increase in available natural catastrophe capacity.

With no withdrawals of capacity from the construction market in the last six months, capacity is at an all-time high, according to the report. At the same time the volume of construction projects in many parts of the world has reduced, intensifying competition between carriers for premium volume and market share in the construction insurance market.



When creating Evacuation, Incident Management or Business Continuity Plans, the focus is usually on what you will do, how you will react and what actions you will take. Unfortunately, those assumptions (yes, they really are ‘assumptions’) don’t necessarily mesh with what may really happen.

First, consider your environment. Are you in a multi-tenant building? Your landlord (or their Building Management staff) has responsibility for the safety of all tenants – including you – and the preservation of property (yours and theirs). Your Evacuation Plan must mesh with those of the other tenants. If not, the result of an evacuation may be chaos, with tenants vying for the same assembly points.

In a single tenant building you don’t own, the landlord/building manager often has the same responsibilities (check your lease).  Once outside the building, you may no longer have the authority to make decisions about when and how to return.



Microsoft has announced that it has acquired InMage, an innovator in the emerging area of cloud-based business continuity.

Explaining the acquisition in a blog, Microsoft states:

“Customers tell us that business continuity – the ability to backup, replicate and quickly recover data and applications in case of a system failure – is incredibly important. After all, revenue, supply chains, customer loyalty, employee productivity and more are on the line. It’s also very complicated and expensive to do. CIOs consistently rank business continuity as a top priority, but often don’t have the budgets or time to do it right.”

“As the productivity and platform company for the mobile-first, cloud-first world, Microsoft is committed to solving this challenge for customers. This acquisition will accelerate our strategy to provide hybrid cloud business continuity solutions for any customer IT environment, be it Windows or Linux, physical or virtualized on Hyper-V, VMware or others. This will make Azure the ideal destination for disaster recovery for virtually every enterprise server in the world. As VMware customers explore their options to permanently migrate their applications to the cloud, this will also provide a great onramp.”



The Business Continuity Institute (BCI) has announced that it has been named ‘Most Respected Training Resource of 2014’ at the prestigious Business Excellence Awards.

Voted for by a worldwide network of professionals, advisers, clients, peers and business insiders, the Acquisition International Business Excellence Awards celebrate individuals and organizations whose ‘commitment to excellence sees them exceeding clients’ expectations on a daily basis while setting the bar for others in their industry’.

Deborah Higgins, Head of Learning and Development at the BCI, said: “To be named Most Respected Training Resource of 2014 is a wonderful recognition of the dedicated and coordinated effort from BCI staff, BCI volunteers and the network of BCI Training Partners and Instructors who have invested in developing and delivering a world class learning experience. Winning this award will spur us all on to continue providing the highest levels of service and meet the challenges of accessing new and developing markets and promoting the profession of business continuity.”


In the beginning, experts said Big Data technologies would lead to the end of enterprise data integration, with data eventually moving into one big in-memory or Hadoop system.

That was before the Internet of Things (IoT), with its never-ending stream of data. It seems the IoT is teaching Big Data humility.

A TechTarget article about a recent O’Reilly Media webcast includes this very telling quote from Mike Olson, co-founder and chief strategy officer for Hadoop distributor Cloudera: "It turns out machines are much better at generating data than you or I. It's why big data is happening; it's why industry is so quickly being transformed."



Tuesday, 15 July 2014 20:32

How to be a Liaison

We talk about it, write about it, have it on our EOC organization charts, but what does it really mean to be a liaison? What are the best ways to use these people and positions?

My first military assignment was as an infantry officer serving in a combat engineer battalion. As such I supported a mechanized infantry battalion when they were on field maneuvers at Fort Hood, Texas. In that era we spent half of our time in the field so I got lots of experience in being a liaison in another organization’s command post. Yes, the principles are all the same.

The primary goal is to have eyes and ears on what is going on. Disasters are fluid, and discerning the situation and its ramifications is not easy. By physically having a person in another organization’s EOC or other facility, you can gauge what is happening and the pace of the activity. And you have to discern whether you will be providing resources or receiving them.



In recent months, as California officials started to calculate the fire danger posed by the state’s prolonged and historic drought, they tucked an extra $23 million into the Cal Fire emergency wildfire budget for the fiscal year that began July 1, bringing its total to $209 million.

By July 6 – just days into the fiscal year – the agency already had spent $13.9 million battling two major blazes, and is now bracing for one of the longest and most difficult fire seasons in memory.

“That’s just the first week, and we still have 51 more weeks to go,” said Daniel Berlant, spokesman for Cal Fire, the California Department of Forestry and Fire Protection. “We’re not even to the peak of the fire season yet.”

Berlant and top fire officials have been warning for months that the state faces serious peril from wildland fires this year, as the drought – stretching into a third year – has sucked dry much of the state’s brush lands and forests more quickly than in years with more normal precipitation levels.



Emergency officials across southwest Ohio say they are confident in the region’s emergency preparedness and ability to respond in the event of a crisis.

Since May, five tornadoes have touched down in parts of Ohio — the nearest being an F3 tornado that landed May 15 in Greene County to the northeast of Butler County, said Brian Coniglio, meteorologist at the National Weather Service in Wilmington.

It’s events like that — as well as mass-casualty incidents, flooding, intense cold and periods of high flu activity — that emergency responders and hospital staff are training for in order to coordinate a quick and efficient response, said Jennifer Mason, emergency medical services and disaster management coordinator at Fort Hamilton Hospital.



Tuesday, 15 July 2014 20:19

Preparing for the Commonwealth Games

Two years on from the London 2012 Olympic Games, the UK is set to play host yet again to one of the largest sporting events in the world – the Commonwealth Games, hosted by the city of Glasgow in Scotland. Glasgow 2014 may not quite be on the same scale as London 2012, but the crowds will still be large.

On the 23rd July, and over the following two weeks, 6,500 athletes from 71 different countries will be taking part in 17 different sports for the right to win a gold medal. 2,500 journalists will be attending the events and with more than a million tickets sold, the number of additional visitors to Glasgow is expected to exceed 100,000.

So what does all this mean for business continuity planners? For many organizations events like this are a dream come true. Investment in the city in order to rebuild infrastructure over the past few years has been high with many local firms reaping the benefit. During the Games, retail outlets will do a roaring trade as the visitors spend their money on souvenirs, food, drink and, seeing as it’s the west coast of Scotland, probably a few umbrellas and rain coats.

For some organizations however, whether getting into the spirit of the Games or not, there will possibly be some disruption during the two weeks.

If you’re an employer then it’s highly likely that a few of your staff will want to attend some of the events or take leave during what is normally the holiday period. Have you taken this into consideration and made suitable arrangements?

Transport networks will be stretched to the limit as trains and roads become busier than normal. Have you made suitable arrangements to ensure your staff can get to work or perhaps work from home instead? If you work in the transport industry, are your customers or suppliers aware that there might be some delays? For such high profile events, security is always an issue and this can slow things down even further.

If you’re a retailer then the increase in visitor numbers means your stock may go quickly (that’s a good thing), but how quickly can you replace it in order to take even greater advantage of the circumstances? With international events such as the Commonwealth Games, language can often be a barrier. English may be the common language for many of the competing countries, but many other languages will be spoken too. Do you have the ability to communicate with non-English speakers?

Let’s not forget the extra strain that will be placed on the communications network. Do you rely on your mobile phone, and can you guarantee it will work when so many other people are trying to use theirs? There may be a similar issue with broadband if the network starts to reach capacity.

Of course, with all the excitement about the influx of new customers, businesses mustn’t forget their existing customers, those people who will (hopefully!) still be there long after the Games are over. Do they know what your arrangements are during the Games and have you considered ways to reduce the disruption to them?

A major event such as the Commonwealth Games brings plenty of opportunities to the host city and the surrounding area, but everything comes at a cost. If you prepare properly however, and consider what disruptions could affect your organization, then plans can easily be put in place to ensure that this cost is not high and is far outweighed by the positives.

Andrew Scott is the Senior Communications Manager at the Business Continuity Institute who joined after a brief stint working as the Press Officer for a national health charity. Prior to that he had over ten years at the Ministry of Defence working in a number of roles including communications and business continuity. During this time he also completed a Masters in Public Relations at the University of Stirling.


After potentially serious back-to-back laboratory accidents, federal health officials announced Friday that they had temporarily closed the flu and anthrax laboratories at the Centers for Disease Control and Prevention in Atlanta and halted shipments of all infectious agents from the agency’s highest-security labs.

The accidents, and the C.D.C.’s emphatic response to them, could have important consequences for the many laboratories that store high-risk agents and the few that, even more controversially, specialize in making them more dangerous for research purposes.

If the C.D.C. — which the agency’s director, Dr. Thomas Frieden, called “the reference laboratory to the world” — had multiple accidents that could, in theory, have killed both staff members and people outside, there will undoubtedly be calls for stricter controls on other university, military and private laboratories.



I wouldn’t normally write about sports on this blog …or at all, really, but here’s an unexpected development: Today, famed statistician and data geek Nate Silver revealed that he and his company, FiveThirtyEight, ran a data analysis on whether LeBron James should stay with Miami or move to Cleveland.

It may seem like an unusual mix, but sports is actually Silver’s original stomping ground. He first made a name for himself by developing the Player Empirical Comparison and Optimization Test Algorithm, a system for forecasting Major League Baseball player performance.

As you might expect, the results are a bit controversial. After all, this is the man who gave us “No, really, your polls are wrong about Mitt Romney winning.”



Things just keep getting curiouser and curiouser in the data center industry.

If it feels like enterprise IT has fallen down the rabbit hole in this age of virtualization and cloud, well, it looks like we’re just getting started. But not all the changes are taking place on the abstract, architectural level. The data center itself is undergoing substantial physical changes as organizations look for innovative ways to boost data productivity while lowering costs.

Examples abound of data centers being built in extreme climates where they can take advantage of naturally cold air or steady winds, but lately it seems that building designs themselves are starting to push an array of unusual envelopes. Take, for example, Foxconn’s latest “green-tunnel” data center, which is literally built inside a long tunnel within the Guiyang industrial park in China. The facility holds up to 12 containerized data centers each packing 504 servers. By leveraging conditions like wind speed and direction, as well as temperature, humidity and geology, the facility is expected to cut power consumption by a third.



It’s hard to wrap your mind around the fact that someone would enter a school building and declare open season on kids. It’s even harder to determine a strategy for how to mitigate that. There’s a growing catalog of “solutions” to help with the problem.

There are a number of trainings available, including the Run, Hide, Fight video and ALICE (Alert, Lockdown, Inform, Counter, Evacuate) training; there’s the mental health issue; the gun issue; there are myriad solutions — buzzers, cameras, locks, bulletproof desk tops — and we discuss some of these and their relative merits in Active Shooter Mirage (renamed Are Schools Focusing Too Much on the Active Shooter Scenario? for online publication).

It seems school districts are grasping at straws, trying to come up with a fix, including investing millions in some cases on security measures like cameras, which by themselves won’t stop a gunman bent on destruction.



The Business Continuity Institute is delighted to be named Most Respected Training Resource of 2014 at the prestigious Business Excellence Awards.

Voted for by a worldwide network of professionals, advisers, clients, peers and business insiders, the Acquisition International Business Excellence Awards celebrate the individuals and organizations whose commitment to excellence sees them exceeding clients’ expectations on a daily basis while setting the bar for others in their industry. They are given to only the most deserving businesses, departments and individuals who have consistently demonstrated outstanding innovation, performance and commitment to their business or clients over the past 12 months and who have received independent nominations from their clients or industry peers.

Deborah Higgins, Head of Learning and Development at the BCI, said: “To be named Most Respected Training Resource of 2014 is a wonderful recognition of the dedicated and coordinated effort from BCI staff, BCI volunteers and the network of BCI Training Partners and Instructors who have invested in developing and delivering a world class learning experience. Winning this Award will spur us all on to continue providing the highest levels of service and meet the challenges of accessing new and developing markets and promoting the profession of business continuity.”

Speaking about the awards, AI Global Media awards coordinator Siobhan Hanley said: “Our Business Excellence Awards are quickly becoming one of our most popular, with businesses all over the globe eager to showcase the amazing work they’ve been doing to achieve stellar results for their clients while really setting the standards for what can be achieved in their sector. We’re proud to be able to showcase some of the most innovative and committed organizations from across the business world and the winners can be rightly proud of the game-changing work they’ve been doing over the past 12 months.”

To find out exactly which businesses have gone above and beyond this year, achieving outstanding results for their clients while demonstrating unwavering commitment to providing the best possible service, visit the AI website where you can view the winner’s supplement.

Based in Caversham, United Kingdom, the Business Continuity Institute (BCI) was established in 1994 to promote the art and science of business continuity worldwide and to assist organizations in preparing for and surviving minor and large-scale man-made and natural disasters.  The Institute enables members to obtain guidance and support from their fellow practitioners and offers professional training and certification programmes to disseminate and validate the highest standards of competence and ethics.  It has circa 8,000 members in more than 100 countries, who are active in an estimated 3,000 organizations in private, public and third sectors.

For more information go to: www.thebci.org

Here in Alabama, residents are no strangers to natural disasters.  Civic histories of many cities and towns throughout the state include references to natural disasters such as fires, tornadoes and hurricanes.

Alabamians know they must be prepared.  Every home should have a smoke alarm; every home should have an emergency supply kit packed and ready.

What not everyone realizes, however, is that being prepared doesn’t have to cost a lot of money.

The Federal Emergency Management Agency’s disaster preparedness website, www.ready.gov, is a destination site for information about getting your family prepared for a disaster.

“FEMA urges residents of every community in every state to Be Informed, Have a Plan and Prepare a Kit,” said Albie Lewis, federal coordinating officer for the Alabama recovery. “Each of these may be critical in a family’s ability to recover from disaster.  A family preparedness kit, particularly, is one of the most important tools at your disposal to keep your family safe in a disaster.”

Commercially available disaster kits can range from $75 to $300 and up, but most of the pieces of a disaster kit may already be in the home; they just need to be gathered together and stored in one place.

“The rule of thumb for residents who are survivors of a disaster is that they should be prepared to take care of their family’s needs for the first 72 hours after a disaster strikes,” says Art Faulkner, director of Alabama’s Emergency Management Agency.  “It may take that long for responders to get to you.”

FEMA recommends that an emergency preparedness kit include food and water for each member of the family for three days, a battery-powered or hand-crank radio, flashlight, spare batteries, first aid kit, non-electric can opener, local maps and personal sanitation items such as hand sanitizer, moist towelettes, toilet paper, garbage bags and plastic ties.

Water supplies should be sufficient to meet both health and sanitation needs.

Family emergency kits also should include important family documents such as wills or property deeds, personal identification and any prescription medicines a family member may be taking.

Other items to consider include sleeping bags or blankets, paper towels, books, puzzles and games for children, pet food and medications for family pets.

It’s helpful to have cash in case banks are closed and there is no power for ATMs.

The emergency supplies can be stored in an easy-to-carry plastic storage container or duffel bag, making them easy to grab and go when an emergency forces people to leave their home.

Rene Bertagna ran a northern Virginia restaurant called the Serbian Crown for 40 years. It attracted Washington, D.C. diners with unusual fare such as horse, lion and kangaroo meat, and was a dining destination in and of itself.

Bertagna blames the Internet, and specifically Google, for its closure last year, according to a July Wired article. He sued Google over the Serbian Crown’s erroneous listing on Google Places, which listed the restaurant as closed on weekends when, in fact, weekends constitute the bulk of the restaurant’s business. He and his attorney contend a hacker created the error, but that Google was unresponsive to his phone calls asking to change the listing.

This problem isn’t as unusual as you’d like to think. Wired offers many other examples, and quotes Mike Blumenthal, a consultant who helps fix listings and who blogs about Google gaffes on his own site.



I follow quite a few small to midsize business (SMB) accounts on Twitter, and noticed that many this week had joined a chat about data privacy for small business (#chatDPD). The topics ranged from the Internet of Things (IoT) to what SMBs know about data privacy.

One tweet in particular caught my eye. It was from AT&T Small Business (@ATTSmallBiz) and it said “Security & privacy must work together, but privacy includes how data is used by your biz and vendors.” It struck a chord with me because I recall a recent event where AT&T found that a breach in its data systems was caused by a vendor whose employee accessed accounts “without authorization.” Of course, I’m sure the person tweeting was aware of the incident, but their tips and views on the privacy chat definitely hold true for both large enterprises and SMBs.

One other thing @ATTSmallBiz pointed out was how SMBs may have policies to guard against cybersecurity issues, but they may not be as detailed or strong as they should be. Also, small businesses may not have IT staff to reinforce such policies. @ATTSmallBiz said:



What do Sayada, Tunisia, and Red Hook, Brooklyn, have in common? At first glance, not much. One is a fishing town on the Mediterranean Sea. The other is a waterfront neighborhood in an industrial section of America’s largest city. But both are using a networking technology that is cheap, relatively easy to set up, and remarkably resilient and secure.

Called a mesh network, the technology lets users connect directly to each other rather than through a central hub. For the citizens of Sayada, that means they can create a community network free from government surveillance or interference. For residents of Red Hook, the local mesh network helps them stay connected during power outages.

Of course, mesh networks aren’t new. They’ve been operating in Europe for years. They are, however, relatively new to the U.S., where they are just starting to catch on. In Detroit, where some neighborhoods don’t have access to broadband, mesh networks are seen as a low-cost solution to the digital divide that exists there. And for many local governments, mesh networks are a relatively simple way to offer high-speed Wi-Fi. Ponca City, Okla., has adopted mesh as a means of delivering free wireless broadband to all of its 25,000 residents.
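
The resilience described above can be illustrated with a toy flooding model: each node relays a message only to its direct neighbors, so as long as any path survives, the message still gets through. This is a minimal sketch; the four-node topology is hypothetical, not any real deployment.

```python
# Toy illustration of mesh resilience: messages flood peer to peer,
# so a message still reaches every connected node even when an
# individual node fails -- there is no central hub to lose.
def reachable(mesh, start, down=frozenset()):
    """Return all nodes a message from `start` can reach by flooding."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node in seen or node in down:
            continue  # already visited, or the node is offline
        seen.add(node)
        stack.extend(mesh.get(node, []))
    return seen

mesh = {  # each node links directly to nearby neighbors
    "A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"],
}
print(reachable(mesh, "A"))              # all four nodes
print(reachable(mesh, "A", down={"B"}))  # still reaches D, via C
```

With node "B" down, "A" still reaches "D" through "C" — exactly the property that keeps a Red Hook mesh usable during an outage.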



Let’s face it: Whether or not policies are in place to prohibit it, business units frequently circumvent the IT department and go out on their own to source the IT products and services they feel they need to stay competitive. So when that happens, who’s really at fault—the business unit, or the IT department?

I recently discussed this topic with Kent Christensen, virtualization and cloud practice director at Eden Prairie, Minn.-based cloud services provider Datalink, who sees the circumvention all the time.

“It’s kind of a given,” Christensen said. “Every organization knows it’s either happening, or somebody has a desire for it to happen.”



Many organizations fail to acknowledge that the scenario most likely to cause a business disruption is an electrical outage.  Without power, everything can grind to a halt.

A sudden loss of electrical power can result from weather, mechanical malfunction, human error or any number of other less common causes (sabotage, solar flares, etc.).  Minutes or days may pass before power is restored.  What should you do to prepare?

Create a Power Outage Policy

A policy may be as simple as “How long will we wait before we let everyone go home?”  That’s practical, but not a very effective business continuity strategy.  A better approach is to base dismissal decisions on time of day: if the RTOs (or MAD, maximum allowable downtime) for local business processes exceed the hours remaining in the workday, everyone goes home.
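
That time-of-day rule reduces to a few lines of logic. This is a minimal sketch; the 5 p.m. workday end and the example RTO values are hypothetical, and a real policy would also account for safety, shifts and recovery-site options.

```python
from datetime import datetime

def should_dismiss(rto_hours: float, now: datetime,
                   workday_end_hour: int = 17) -> bool:
    """Send staff home if the process RTO (or MAD) exceeds the hours
    remaining in the workday -- i.e., power is unlikely to be restored
    before everyone would have left anyway."""
    hours_remaining = workday_end_hour - (now.hour + now.minute / 60)
    return rto_hours > max(hours_remaining, 0)

# Example: a process with a 6-hour RTO during an outage starting at 2 p.m.
print(should_dismiss(6, datetime(2014, 7, 10, 14, 0)))  # prints True
```

A 6-hour RTO against 3 remaining work hours means recovery won’t land before closing time, so dismissal is the rational call; the same outage at 9 a.m. would not trigger it.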



BCM experts and practitioners offer insights to raise the profile and relevance of business continuity professionals

PLYMOUTH MEETING, Pa. – Strategic BCP®, a team of business continuity planning (BCP) and management specialists, has announced the availability of its new blog featuring expert content on the topics that help streamline BCP for enterprise resilience and that raise the profile and relevance of business continuity (BC) professionals across their organizations.

The blog acts as an open forum to share ideas that are driving and challenging BCM strategies today. Its content will be comprised of insights authored by Strategic BCP contributors and guest bloggers, tapping into the vast industry knowledge and experience as hands-on consultants and as managers of BC, disaster recovery (DR), and information technology (IT).

Topics will offer best practices, lessons learned, and real-world success examples. Current BCP software considerations, processes, and compliance standards will also be discussed.

Our bloggers currently include:

  • Frank Perlmutter (CBCP, MBCI): Founder of Strategic BCP & Former DR/COOP (Continuity of Operations Planning) Manager for the U.S. Department of the Treasury

  • Dave Olkowski (CBCP, MBCI): Senior Manager & Former BC Analyst at MBNA America Bank

  • Cherie Taylor (CBCP): Senior Manager & Member of the Business Continuity Planners Association (BCPA) Board of Directors

  • Chris Duffy (CISSP): Senior Manager & Former CIO at Peirce College in Philadelphia

“As colleagues with common goals, there’s no shortage of information to be shared given how complicated this industry can be,” says Kimberly Lawrence (ABCP), Vice President and Business Continuity Program Manager at Umpqua Bank (formerly with Sterling Bank before the merger). “Unbiased viewpoints from real practitioners can help both newcomers and even seasoned pros who are responsible for BC planning.”

Some recent posts include:

To read additional posts, visit http://www.strategicbcp.com/blog. If you are interested in contributing to the blog or collaborating on content, email jsolick@strategicbcp.com.

About Strategic BCP

Strategic BCP® represents a team of business continuity management specialists who empower organizations of all sizes to build cost-effective, action-based plans that can be implemented immediately in the event of downtime. The company’s award-winning BCM software, ResilienceONE®, integrates risk assessment and management, BC plan development and maintenance, incident management, and compliance issues in one comprehensive easy-to-implement solution. It features proprietary algorithms and metrics that automate cumbersome tasks and provide comprehensive insight into an organization’s risk profile. Strategic BCP complies with the U.S.-EU Safe Harbor Framework and the U.S.-Swiss Safe Harbor Framework. More information: www.strategicbcp.com.

A new report by EEF, the manufacturers' organization, warns the UK Government to act over escalating risks to the UK's supply of essential materials. It says that the global growth in middle-class consumers, increased demand for all commodities and an over-reliance on China for strategic supplies, is leaving the UK vulnerable. But, while other manufacturing nations have strategies in place to shield their economies from resource risks, the UK is lagging behind.

The report ‘Materials for Manufacturing: Safeguarding Supply’ digs behind concerns raised by UK manufacturers that volatile material prices and security of supply pose a threat to growth and confirms that the UK does indeed face escalating risks.

Globally, the consuming middle classes are expected to swell from 1.8 billion people to 4.9 billion by 2030. Demand for all commodities is expected to rocket by 30 to 80 percent by 2030. However, the UK's supply of essential materials – ranging from silicon metal and rare earth elements through to coking coal - is concentrated. China is the leading supplier of materials to the UK, producing 22 of 38 elements of strategic economic value. These are minerals and metals that are vital to British manufacturing.



Thursday, 10 July 2014 15:55

BCI Diploma – Good Reasons

The BCI Diploma is the only BC award that provides both a route to Institute membership and a significant development in confidence, capability and subject knowledge for those who succeed in achieving it.

The designation DBCI shows that the holder has gone the significant extra distance and studied BCM in depth, looking far beyond frameworks and simple guidance, and researching the subjects related to continuity, resilience and associated issues in significant depth.  The DBCI also indicates that the holder has the potential to succeed at postgraduate level, and we have several graduates from the Diploma now enrolled on our MSc Organisational Resilience.



Thursday, 10 July 2014 15:54

Disaster Recovery Lessons from Radiology

When hospitals moved from film-based hardcopy systems to electronic images, they began to generate large amounts of data held on PACS – Picture Archiving and Communications Systems. Hospitals use various ‘modalities’ to scan patients, including Computed Tomography, Magnetic Resonance Imaging and Ultrasound systems. These modalities must regularly (and frequently) upload the scanned images to the PACS, where they can be stored, sequenced for retrieval and made available for remote diagnosis. However, a PACS is often a potential single point of failure with inevitable downtime – which is where the DR lessons start.
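
One way to blunt that single point of failure is for the modality side to fail over to a secondary archive and queue images locally when every archive is down. The sketch below illustrates the idea only: the endpoint names and `send` callback are hypothetical, and a real deployment would use DICOM C-STORE via a proper PACS library rather than this toy interface.

```python
# Minimal sketch of a modality-side upload strategy that tolerates PACS
# downtime: try the primary archive, fail over to a secondary, and hold
# images locally if both are unreachable.
from collections import deque

local_queue = deque()  # images retained on the modality while PACS is down

def upload_image(image, send, endpoints=("pacs-primary", "pacs-secondary")):
    for endpoint in endpoints:
        try:
            send(endpoint, image)
            return endpoint          # stored successfully
        except ConnectionError:
            continue                 # try the next archive
    local_queue.append(image)        # all archives down: keep locally
    return None

def flush_queue(send, endpoints=("pacs-primary", "pacs-secondary")):
    """Re-attempt queued uploads once an archive is reachable again."""
    while local_queue:
        image = local_queue.popleft()
        if upload_image(image, send, endpoints) is None:
            break                    # still down; image was re-queued

# Demo with a simulated network where the primary archive is down.
def _send(endpoint, image):
    if endpoint == "pacs-primary":
        raise ConnectionError("primary PACS is down")

print(upload_image("ct-scan-001", _send))  # prints pacs-secondary
```

The queue-and-flush pattern is the core DR lesson: scanning can continue through PACS downtime, and nothing is lost as long as local storage holds out.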



School shootings have captured the attention of the American public and certainly school administrators, who feel compelled to do something to prevent or mitigate the effects of a similar incident taking place on their grounds.

Solutions — in the form of cameras, metal detectors, buzzers, bulletproof white boards and the like — are coming out of the woodwork and are being foisted upon administrators. There is a lot of training available too, such as the Run, Hide, Fight video that demonstrates what to do in the event of an active shooter, including taking down an armed gunman. 

But there are problems with these approaches and educators are missing key elements of managing these scenarios by relying on some of the technology fixes and the active shooter training, some experts say.

The Run, Hide, Fight training is an alternative to simply waiting for law enforcement, which is ineffective since most violent acts are over in minutes, before officers can arrive. The objective of the training videos is to condition students and administrators, anyone faced with the potentially deadly situation of an active shooter, to recognize the best avenues for avoiding bloodshed.



Thursday, 10 July 2014 15:51

Forgotten Smallpox Discovered Near D.C.

National Institutes of Health workers preparing to move a lab in Bethesda, Md., found an unwelcome surprise in a storage room this month: six vials of smallpox.

There is no evidence that any of the vials was breached, and no lab workers or members of the public were exposed to the infectious and potentially deadly virus, the federal Centers for Disease Control and Prevention said in its announcement Tuesday.

The vials labeled variola — a name for the smallpox virus — were found July 1 “in an unused portion of a storage room” and seem to date to the 1950s, the CDC said. They were freeze-dried, intact and sealed, forgotten and packed away in a cardboard box, officials said.

The vials were "immediately secured" in a containment lab, then transported via government aircraft Monday to the CDC’s containment facility in Atlanta, it said.



The General Assembly of the Federation of European Risk Management Associations (FERMA) has agreed a framework and funding to create European certification for risk managers.

It will include certification of the professional competences and experience of individual risk managers as well as accreditation for the risk management programmes of education bodies. Certification will be supported by a requirement for continuous professional development and a code of ethics.

The General Assembly also agreed for funding from FERMA’s reserves for the development and implementation phase of the project, which will run on a not-for-profit basis.

FERMA vice president Michel Dennery said: “A new designation is born: European Certified Risk Manager. We can be proud that we will be providing one of the first pan-European recognitions of a profession, and one that will be a benchmark for other risk managers’ professional bodies worldwide.”



Cyber security has been ranked third in a list of boardroom investment priorities, according to a survey released earlier this month by KPMG.

The annual Business Instinct Survey, a poll of 498 C-level executives from businesses across the UK, found under-investment has left many businesses acknowledging the need to increase spending on secure technology.

However, despite acceptance that cyber security is critical to long-term business operations, one in three executives questioned (36 percent) said investing in people skills had become their number one concern, with 19 percent also more focused on plant or machinery purchases.

According to the findings, data protection and cyber threats also ranked third behind corporate governance and regulatory change, and supply chain risk/procurement when boards considered the main risk issues influencing their approach to managing their businesses.



Wednesday, 09 July 2014 15:54

BCI Education Month Reduction

Did you know that September 2014 is BCI Education Month?  There are lots of initiatives to develop educational opportunities in BCM.  Here at Bucks New University we are offering 10% off (a saving of £250.00) for those who enrol on the September cohort of the BCI Diploma.

Education Month details here: http://www.thebci.org/index.php/training-education/bci-education-month

Wednesday, 09 July 2014 15:53

New – Foundation Degree in Cyber Security

Cyber-attacks are among the main security issues facing organisations in the Information Age. The UK Government’s National Security Strategy (first published in 2011) categorises cyber-attacks as a Tier One threat to our national security, alongside international terrorism. According to the UK Government, 93% of large corporations and 87% of small businesses reported a cyber-breach in the past year and analysis from the UK Ministry of Defence estimates the cost to the UK economy at around £11.6 billion a year.

The Government has allocated £860 million towards the UK’s national cyber security strategy to 2016 which has the four objectives of:

  • making the UK one of the most secure places in the world to do business in cyberspace;

  • making the UK more resilient to cyber-attack and better able to protect our interests in cyberspace;
  • helping shape an open, vibrant and stable cyberspace that supports open societies;
  • building the UK’s cyber security knowledge, skills and capability.



Wednesday, 09 July 2014 15:52

If Tuberculosis Spreads ...

ATLANTA — DRUG-RESISTANT tuberculosis is on the rise. The World Health Organization reports around 500,000 new drug-resistant cases each year. Fewer than half of patients with extensively drug-resistant tuberculosis will be cured, even with the best medical care. The disease in all its forms is second only to AIDS as an infectious killer worldwide.

The United States has given more than $5 billion to the Global Fund to Fight AIDS, Tuberculosis and Malaria. But drug-resistant tuberculosis isn’t a problem only in the developing world; we must turn our attention to the fight against it here at home.

Tuberculosis rates have declined in the United States in the last decade. In 2012, there were around 10,000 cases, and of those, only 83 were resistant to all of the most commonly used tuberculosis drugs — 44 fewer than in 2011. So far we have been lucky. The low numbers hide the precarious nature of the nation’s public health defense, and how vulnerable we would be to an epidemic.



Big Data promises to bring big changes to the way enterprises collect, store, analyze and use their data. From increased infrastructure to new marketing usage, Big Data will affect many areas of the company. So it’s no wonder that with all that looming on the horizon, hiring managers are scrambling to fill positions opened up by the latest big technology—including software engineering jobs.

In the realm of Big Data, software engineers will be required to find ways to integrate the enormous amounts of data into programs that solve business challenges. If your company is looking to create a new division of software engineering just for Big Data, a good place to start is to hire a senior position to head up the team.

In our IT Downloads area, you will find a ready-to-use job description for a Senior Software Engineer/Big Data. The description is useful for human resources departments and hiring managers when deciding the qualifications of a senior-level software developer in this area. The job description can be used as-is, or use the information included to spur your own company to create a job description for such a position.



Much attention has been paid to the likelihood of more drought, fires and floods as the planet warms, but the most significant impact on public infrastructure won't come from extreme weather events, Paul Chinowsky says.

It will be the change in what constitutes normal weather in various regions — higher temperatures for more sustained periods of time, higher or lower average humidity and rainfall — that will most tax buildings, roads and bridges that were built for one set of conditions and now have to function in another.

"Road surfaces get weaker in heat," Chinowsky said. "Asphalt gets softer. As trucks and cars pass, you get a lot more potholes, more cracking. It won't be a one time event but a constant thing. That's the part we don't talk about, but that's the part that's going to have a huge economic impact."



The role of local authorities is crucial in the steps towards building resilience against natural disasters due to their ability to manage risk and ensure prevention on the front line, the Committee of the Regions has argued.

The Committee – an assembly of local and regional leaders from all member states – was represented by Cllr Siggs of the European Conservatives and Reformists Group from the UK’s County Council of Somerset. His comments were made in response to a European Commission proposal that contributes to the EU's international obligations in finding a common strategy to build resilience to disasters.

Worldwide between 2002 and 2012, natural disasters were responsible for more than 80,000 deaths and the economic cost was as high as €95bn (£75bn). Cllr Siggs stated that local authorities have three clear roles in disaster management: preparing through improved resilience; reacting with improved coordination; and dealing with the impact afterwards.



Effective IT governance is a critical tool for CIOs to align their organizations and efforts to support business strategy and create shareholder value. Given the rapidly changing and evolving technology options that confront CIOs and business leaders, making sure the right decisions are being made about investments in IT is an essential priority.

There are many misconceptions about what constitutes a comprehensive IT governance model and how it is implemented. IT governance is more than just:

  • Having a steering committee that meets periodically to review and approve IT plans and budgets
  • Involving the business on an annual basis to assist in assigning IT priorities
  • Using financial metrics such as ROI to determine whether to invest in specific initiatives
  • Instituting best practices to ensure projects are completed on time and within budget
  • Measuring and reporting on user satisfaction of IT services
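
The ROI bullet, for example, reduces to a one-line calculation — which is precisely why governance has to go beyond it. A minimal sketch (the investment figures below are hypothetical):

```python
def roi(gain: float, cost: float) -> float:
    """Simple return on investment: net gain as a fraction of cost."""
    return (gain - cost) / cost

# Hypothetical initiative: $120k of expected benefit on an $80k investment.
print(f"{roi(120_000, 80_000):.0%}")  # prints 50%
```

A number like 50% says nothing about strategic alignment, risk or opportunity cost, which is why a governance model built solely on financial metrics falls short of the comprehensive approach described here.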



Tuesday, 08 July 2014 16:19

ASOS disaster recovery response praised

The recent fire at the distribution centre of leading British online retailer ASOS is a textbook example of the importance of having an effective disaster recovery plan in place across your organization’s supply chain in order to ensure business continuity, says Jonathan Gibson, Head of Logistics at supply chain consultancy firm Crimson & Co.

The incident, which occurred in late June at ASOS’s main distribution centre in Barnsley, caused damage to 20 percent of the retailer’s stock, and consequently required the business to temporarily cease trading. Despite this, the online retailer made an efficient recovery and was operational again in 48 hours. Gibson states that the impressive recovery strategy is an eye opener to fellow retailers, demonstrating the importance of implementing a structured plan that is able to identify risks across your business.

“The ASOS warehouse fire brings home the importance of having backup and disaster recovery processes in place across your organization’s supply chain. Ultimately, consumers’ sympathy for an incident such as this will only go so far, and if you are offline for a significant amount of time customer loyalty will waver and they will start to look elsewhere.



Digital Realty Trust, Inc. has released Australia-specific findings following its annual commissioned survey of Asia Pacific data centre trends conducted by Forrester Consulting.

According to the survey, 76 percent of Australian organizations expect to increase spending on data centre facilities over the next 12 months, with 59 percent of respondents expecting to increase spending by 5 – 10 percent and 17 percent of respondents expecting to increase spending by more than 10 percent.

Big Data was cited as the key driver of data centre growth in Australia by over half (51 percent), followed by virtualization (39 percent) and business continuity (37 percent).

Additional findings from the survey include:

  • CIOs continue to have the strongest influence on data centre spend in Australia with over half (52 percent) of respondents identifying the CIO or most senior IT decision maker as influencing the decision, closely followed by the CEO (46 percent) and the IT VP/manager/director (46 percent).
  • Over half (52 percent) of Australian organizations surveyed have between one to four data centres.
  • Exactly half of respondents (50 percent) cited the need to expand space and number of cabinets/racks as the main reason their data centre facilities are running out of capacity.


For some organisations, it’s an explicit legal requirement. For others, it’s the consequence of prevailing laws and regulatory structures. The mandatory requirement defined by the Australian Government for its agencies sets the tone: “Agencies must establish a business continuity management (BCM) program to provide for the continued availability of critical services and assets, and of other services and assets when warranted by a threat and risk assessment.” And for the rest? There’s a strong argument to be made that business continuity management is no longer a choice for any enterprise – and that an obligation for BCM is a good thing anyway.



Enterprise executives are under intense pressure these days to deliver a wide range of new and expanding data services amid shrinking IT budgets. It seems that the front office has taken to heart the mantra “do more with less” and is buoyed by the notion that the cloud will come to the rescue when confronting all manner of data challenges.

This is part of the reason why top research firms like Gartner are starting to pull back on their IT spending predictions. As I noted last week, the company recently trimmed its 2014 growth outlook by about a third, from a 3.2 percent expansion to 2.1 percent, even though that still represents a substantial $3.75 trillion market. A deeper dive into the numbers, however, shows a data center hardware market flat-lining for the next year at about $140 billion, while an oversupply of SAN and NAS systems is likely to drive prices down even further. IT services, meanwhile, are looking at about 3.8 percent growth this year, representing nearly $1 trillion in revenue.

But is it really that bad? Are we on a one-way street to a utility-style, all-cloud data center? Hardly – at least, not yet. The fact remains that there are plenty of compelling reasons for enterprises of all stripes to build and maintain local data infrastructure, both as stand-alone IT environments and hybrid systems tied to third-party resources.



Tuesday, 08 July 2014 16:15

Shadow IT Is Risky Business

A few months ago, I was asked to write an excerpt on shadow IT for an e-book. I had to decline because I didn’t know much about shadow IT. Heck, I didn’t know anything about shadow IT—or so I thought. I just didn’t recognize it by that name. It turns out that it is a topic I’ve touched on: the whole idea of employees using outside technology, particularly cloud technologies, for business purposes but doing so without permission from the IT department. Thanks to free applications, downloads and the rise of BYOD, shadow IT has become common in the workforce. A study released earlier this year by Frost & Sullivan Stratecast and commissioned by McAfee defined shadow IT in this way:

SaaS applications used by employees for business, which have not been approved by the IT department or obtained according to IT policies. The non-approved applications may be adopted by individual employees, or by an entire workgroup or department. Note that we specified that the non-approved applications must be used for work tasks; this study is not about tracking employees’ personal Internet usage on company time.



Emergency dispatchers and response teams are struggling with a widening language divide as they attempt to service Waterloo’s growing population of non-English speakers.

The communication barrier creates problems for all parties involved, from the dispatcher deciphering a 911 call to the officer trying to put together an accurate police report to the concerned resident trying to communicate a problem with little to no knowledge of the English language.

Over recent years, Waterloo Police have dealt with a slew of languages including Bosnian, Spanish, Serbian, Croatian, Burmese, French and Vietnamese.

In 2006, Burmese refugees began settling in Waterloo for the employment opportunities at Tyson's meat plant, and the community has been growing ever since.



In the study measuring effects of enterprise risk management (ERM) maturity—as defined by the RIMS Risk Maturity Model (RMM) assessment—no attribute had a more meaningful impact on bottom-line corporate value than Performance Management. The correlation is not an accident. While many organizations say they have an effective handle on risk, their ability to execute the policies and procedures they’ve put into place is severely lacking.

The sixth RMM attribute of ERM maturity, Performance Management, measures an organization’s ability to execute vision and strategy through the effective use of a balanced scorecard.



There’s a lot of talk about organizations becoming resilient and how they need to be resilient if they are to compete successfully and respond effectively to the ever-increasing disasters of the world – both man-made and natural in causation. But that begs the question: can organizations be resilient? In this practitioner’s opinion, yes, they can – though it takes more than a single aspect to become resilient.

Many would have you believe that you can buy resiliency off a shelf; a service or product purchased from a firm touting that they can make your organization resilient, as though the procurement of a ‘product’ will make an organization resilient. Well, unless they are a pseudo-psychologist or have a background in leadership psychology, they can’t; at least not completely. Sure, it’s fine to say that Business Continuity Plans (BCP) and Technology Recovery Plans (TRP) et al will make an organization resilient but that’s just not the complete picture. It’s only part of the overall picture.

It’s just not a simple concept – though it would be great if it were. What will make an organization resilient? Is there some sort of magic ingredient that will suddenly ensure that an organization will bounce back from any adverse situation? Well, yes and no. It’s not one single ingredient; it’s multiple ingredients that, when combined just so, will help any organization get through difficult situations.



Monday, 07 July 2014 17:14

On the Cusp of a New Data Environment

Hindsight is often 20/20, but sometimes foresight can be illuminating, too.

Gartner caused a mini-stir this week when it issued its latest prediction for data center spending in the coming years. Despite the rebounding economy and the drive to build out cloud infrastructure, the group is actually dialing back the rate of growth by a rather hefty margin. Rather than the 3.2 percent growth that the company anticipated earlier in the year, the forecast is now set at 2.1 percent, which translates to about $3.75 trillion.

Of course, this is still a significant wad of cash, representing the sum total of all data-related spending across the globe, ranging from devices and data center systems to software solutions, telecom and the wealth of new services that are hitting the market at a steady clip. For IT’s part, Gartner is expecting a still respectable 3.7 percent climb into 2015, representing about $3.9 trillion in revenues.



WASHINGTON -- The Federal Emergency Management Agency (FEMA) and its federal partners continue to monitor Hurricane Arthur’s impact and northward track. The agency encourages those in Arthur’s path to listen to their local officials, monitor storm conditions and take steps to be prepared.

"Residents are urged to continue to listen to the instructions of your local officials," said Craig Fugate, FEMA Administrator.  "As the storm continues to move along the east coast, there are a number of areas that can be affected by strong winds, storm surge, inland flooding and tornadoes. If you evacuated and are considering returning home, make sure local officials have deemed the area safe to return.” 

Through regional offices in Atlanta, Philadelphia, New York and Boston, FEMA remains in close contact with emergency management partners in North Carolina and potentially affected states and has a liaison in the emergency operations center in Massachusetts. FEMA is also working in coordination with the National Weather Service and National Hurricane Center.

In advance of the storm, FEMA had liaisons in the emergency operation centers in North Carolina and South Carolina and an Incident Management Assistance Team (IMAT) in North Carolina to coordinate with state, tribal and local officials should support be requested or needed. Additional teams from around the country are ready to deploy to impacted states and tribes as necessary.

According to the National Weather Service, Tropical Storm Warnings remain in effect for portions of the east coast as Hurricane Arthur moves northward. The latest storm tracks, local forecasts and warnings are available at hurricanes.gov and weather.gov.

As the first hurricane of the Atlantic hurricane season, Hurricane Arthur serves as a reminder for residents in areas prone to tropical storms and hurricanes to refresh their emergency kits and review family emergency plans. Those who do not have an emergency kit or family plan can learn about steps to take now to prepare for severe weather at ready.gov.

The FEMA smartphone app provides safety tips and displays open shelter information at www.fema.gov/smartphone-app. Information on Red Cross shelters is available by downloading the Red Cross Hurricane app or by visiting redcross.org.

Safety and Preparedness Tips

  • Residents and visitors in potentially affected areas should be familiar with evacuation routes, have a communications plan, keep a battery-powered radio handy and have a plan for their pets. Individuals should visit ready.gov or listo.gov to learn these and other preparedness tips for tropical storms.
  • Know your evacuation zone and be sure to follow the direction of state, tribal and local officials if an evacuation is ordered for your area.
  • Storm surge is often the greatest threat to life and property from a hurricane. It poses a significant threat for drowning and can occur before, during, or after the center of a storm passes through an area. Storm surge can sometimes cut off evacuation routes, so do not delay leaving if an evacuation is ordered for your area.
  • If you encounter flood waters, remember – turn around, don’t drown.
  • Driving through a flooded area can be extremely hazardous and almost half of all flash flood deaths happen in vehicles. When in your car, look out for flooding in low lying areas, at bridges and at highway dips. As little as six inches of water may cause you to lose control of your vehicle.
  • If your home has flood water inside or around it, don’t walk or wade in it. The water may be contaminated by oil, gasoline or raw sewage.
  • Hurricanes have the potential for tornado formation. If you are under a tornado warning, seek shelter immediately in the center of a small interior room (closet, interior hallway) on the lowest level of a sturdy building. Put as many walls as possible between you and the outside.
  • Stay off the roads in impacted areas. Emergency workers may be assisting people in flooded areas or cleaning up debris. You can help them by staying off the roads and out of the way.
  • If your power is out, safely use a generator or candles.
    • Never use a generator inside a home, basement, shed or garage even if doors and windows are open.
    • Keep generators outside and far away from windows, doors and vents. Read both the label on your generator and the owner's manual and follow the instructions. 
    • If using candles, please use caution. If possible, use flashlights instead.
  • Avoid downed power or utility lines; they may be live with deadly voltage. Stay away and report them immediately to your power or utility company.
  • When the power comes back on, wait a few minutes before turning on major appliances, to help eliminate problems that could occur if there's a sharp increase in demand. If you think electric power has been restored to your area but your home is still without power, call your local power company.
  • Get to know the terms that are used to identify severe weather and discuss with your family what to do if a watch or warning is issued.

For a Tropical Storm:

  • A Tropical Storm Watch is issued when a tropical cyclone containing winds of 39 MPH or higher poses a possible threat, generally within 48 hours.
  • A Tropical Storm Warning is issued when sustained winds of 39 MPH or higher associated with a tropical cyclone are expected in 36 hours or less.

For coastal flooding:

  • A Coastal Flood Advisory is issued when minor or nuisance coastal flooding is occurring or imminent.
  • A Coastal Flood Watch is issued when moderate to major coastal flooding is possible.
  • A Coastal Flood Warning is issued when moderate to major coastal flooding is occurring or imminent.

More safety tips on hurricanes and tropical storms can be found at ready.gov/hurricanes.

Not every company has a Big Data problem. In fact, many companies are operating in “relatively sparse data environments,” says David Meer, a partner with Strategy&’s consumer and retail practice.

This isn’t your usual rant about how companies need to fix small data problems before embracing Big Data. No, Meer’s Strategy+Business article is much more original. He’s proposing that companies revisit existing data, and then seek out ways to add to or fill out that data for strategic advantage.

Why would they do this? It turns out the market doesn’t care if you don’t have large datasets and can’t afford to buy them.  You still need to compete against data-driven companies.



Cyber security and data protection have been ranked a surprising third in a list of boardroom priorities, according to a survey released by KPMG.

The annual Business Instincts Survey, a poll of 498 C-level executives from businesses across the UK, found that under-investment has left many businesses acknowledging the need to increase spending on secure technology. Yet despite acceptance that cyber security, specifically, is critical to long-term business operations, one in three executives questioned (36 percent) said that investing in people skills had become their number one concern, with 19 percent also more focused on plant or machinery purchases.



Most organizational decisions to slow or ban Bring Your Own Device (BYOD) in the workplace seem to center on security issues – which, of course, are valid and concerning to IT groups that must balance users’ conflicting security, productivity and convenience needs. CIO.com, for instance, describes a large electrical contractor, Rosendin Electric, that has a no-BYOD policy. Employees keep asking about it, but CIO Sam Lamonica worries about security breaches and says, “We have a user base that might not, in a lot of cases, make the right choices.” The article also cites a CompTIA survey of 400 IT and business execs in which just over half said they are not “doing” BYOD, period.

But CIOs and IT managers are also now dealing with less quantifiable problems that may grow along with BYOD and the mobile worker’s lifestyle. These problems range from angst and worry over job loss, to fear of being expected to work unlimited hours, to uncertainty about which responsibilities could increase with BYOD’s freedom.



Monday, 07 July 2014 17:09

Five Ways MDM Benefits Business Users

Some experts think too many organizations are approaching master data management (MDM) as a “must-do” without really understanding or achieving its potential. In fact, Forrester MDM and data expert Michele Goetz says MDM isn’t something every company should pursue.

If you’re interested in drilling down on the potential of MDM, check out this recent Infosys BPO blog post. Granted, as a technology consultancy, the company has good business reasons to promote MDM (did you see that their CEO is now India’s highest-paid executive?) and the post may have elements of their model in it. But mostly, it seems pretty straightforward, with solid information.

The blog post provides some telling statistics, although it doesn’t source the surveys or provide specifics, so it’s impossible to judge their legitimacy. For instance, the piece cites a 2013 survey that found only 21 percent of organizations rated their data quality as high or better, with most rating it “fair.” I will say that information falls in line with past research that I’ve read.



NEW YORK – New Yorkers know about severe weather. After Hurricane Sandy, 2013 brought 15 significant weather events to New York, including winter snow and ice storms, a tornado, extreme heat, brush fires, heavy rains and flooding. Two of those events resulted in major disaster declarations for the state.

Next week, March 2-8, is National Severe Weather Preparedness Week, a nationwide campaign to remind everyone that severe weather can affect anyone. The effort is sponsored by the Federal Emergency Management Agency and the National Oceanic and Atmospheric Administration.

Across the U.S. last year, there were seven severe weather events that crossed the $1 billion mark in economic and property damage. These disasters, including floods, tornadoes and wildfires, caused the deaths of 109 people.

NOAA and FEMA urge all New Yorkers to understand the risks where you live and how severe weather could affect you and your family. Check weather forecasts, get a NOAA Weather Radio and sign up for local weather alerts from emergency management officials. Check NOAA’s National Weather Service website for more information: www.weather.gov.

Next, make sure you and your family are prepared for severe weather. Develop a family communication and disaster preparedness plan, keep important papers, medications and other essential items in a safe place and visit www.Ready.gov/severe-weather to learn more.

Being prepared for severe weather need not be complicated or costly. Keeping a few simple items handy in a disaster kit, for example, could end up being a lifesaver. Go to www.ready.gov/basic-disaster-supplies-kit to find out more about what to include in a basic kit and how to develop one for those with unique needs. The same information is available in Spanish at www.listo.gov.

Once you have taken action to prepare for severe weather, set an example by sharing your story with family and friends on any social media site. Be a "force of nature" and inspire others in your community to take action too. Pledge to prepare by signing up for America’s PrepareAthon on April 30 at www.fema.gov/americas-prepareathon.

The Federal Emergency Management Agency (FEMA), through its National Watch Center in Washington and its regional office in Atlanta, and in coordination with the National Weather Service and National Hurricane Center, is monitoring the conditions of Tropical Storm Arthur off the east coast of Florida. FEMA remains in close contact with state emergency management partners in potentially affected states.

According to the National Weather Service, a Tropical Storm Watch is in effect for the east coast of Florida from Fort Pierce to Flagler Beach. A Tropical Storm Watch means that tropical storm conditions are possible within the watch area, in this case within 24 hours. Tropical Storm Arthur is expected to move northwest today and then north on Wednesday. Arthur is expected to become a hurricane by Thursday near the coast of the Carolinas. Visit Hurricanes.gov  and Weather.gov for the latest storm track and local forecasts.

FEMA urges residents and visitors in potentially affected areas to closely monitor the storm and take steps now to be prepared in advance of severe weather and most importantly, follow the direction of state, tribal and local officials.

FEMA has deployed liaisons to the emergency operations centers in North Carolina and South Carolina along with an Incident Management Assistance Team (IMAT) to North Carolina to coordinate with local officials, should support be requested or needed. FEMA’s regional office in Atlanta is in contact with its emergency management partners in Florida, North Carolina and South Carolina. FEMA’s National Watch Center is at an Enhanced Watch.

As the first tropical storm of the Atlantic hurricane season, Tropical Storm Arthur serves as a reminder for residents in areas prone to tropical storms and hurricanes to refresh their emergency kits and review family plans. If you do not have an emergency kit or family plan, or to learn about steps you can take now to prepare your family for severe weather, visit ready.gov.

At all times, FEMA maintains commodities, including millions of liters of water, millions of meals and hundreds of thousands of blankets, strategically located at distribution centers throughout the United States, that are available to state and local partners if needed and requested.

Tropical Storm Safety Tips:

  • Residents and visitors in potentially affected areas should be familiar with evacuation routes, have a communications plan, keep a battery-powered radio handy and have a plan for their pets. Individuals should visit ready.gov or listo.gov to learn these and other preparedness tips for tropical storms.
  • Know your evacuation zone and be sure to follow the direction of state and local officials if an evacuation is ordered for your area.
  • Storm surge is often the greatest threat to life and property from a hurricane. It poses a significant threat for drowning and can occur before, during, or after the center of a storm passes through an area. Storm surge can sometimes cut off evacuation routes, so do not delay leaving if an evacuation is ordered for your area.
  • Driving through a flooded area can be extremely hazardous and almost half of all flash flood deaths happen in vehicles. When in your car, look out for flooding in low lying areas, at bridges and at highway dips. As little as six inches of water may cause you to lose control of your vehicle.
  • If you encounter flood waters, remember – turn around, don’t drown.
  • Tropical Storms have the potential for tornado formation. If you are under a tornado warning, seek shelter immediately in the center of a small interior room (closet, interior hallway) on the lowest level of a sturdy building. Put as many walls as possible between you and the outside.
  • Get to know the terms that are used to identify severe weather and discuss with your family what to do if a watch or warning is issued.

For a tropical storm:

  • A Tropical Storm Watch is issued when a tropical cyclone containing winds of 39 MPH or higher poses a possible threat, generally within 48 hours.
  • A Tropical Storm Warning is issued when sustained winds of 39 MPH or higher associated with a tropical cyclone are expected in 36 hours or less.

For coastal flooding:

  • A Coastal Flood Watch is issued when moderate to major coastal flooding is possible.
  • A Coastal Flood Warning is issued when moderate to major coastal flooding is occurring or imminent.
  • A Coastal Flood Advisory is issued when minor or nuisance coastal flooding is occurring or imminent.

More safety tips on hurricanes and tropical storms can be found at ready.gov/hurricanes.

This blog article talks about a step in the Business Continuity Planning (BCP) methodology that I think is missing – and I happen to think it is a pretty important step.

One of the greatest challenges in the BCP methodology is establishing the program’s recovery objectives. Whether you label them Maximum Acceptable Downtime (MAD), Recovery Time and Recovery Point Objectives (RTO & RPO), or some other creative acronym unique to your process, these program benchmarks are usually arrived at through a Business Impact Analysis (BIA) process or, at least, through some survey or interview with business managers and subject matter experts to establish what the critical business processes are, in what timeframes they must be recovered, and what resources must be available in those timeframes to enable the continuity or recovery of those processes. Does this sound familiar? Am I right, so far?

But – you knew there was going to be a but – to achieve what end?  I mean, we do a great job defining business continuity objectives, but do we do so against established business objectives?
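To make the point concrete, here is a purely hypothetical sketch – every process name and figure below is invented for illustration – of how BIA-derived recovery objectives might be checked against what the organization can actually deliver:

```python
# Hypothetical check of BIA-derived recovery objectives against estimated
# recovery capability. All process names and numbers are invented examples.

# (process, RTO in hours, estimated actual recovery time in hours)
bia_results = [
    ("payroll",     24, 12),
    ("order_entry",  4,  8),
    ("email",       48, 24),
]

# A gap exists wherever the estimated recovery time exceeds the stated RTO.
gaps = [(name, rto, actual) for name, rto, actual in bia_results if actual > rto]

for name, rto, actual in gaps:
    print(f"{name}: estimated recovery of {actual}h misses its {rto}h RTO")
```

The missing step described above would go one layer deeper: validating the RTO column itself against established business objectives, rather than only measuring recovery capability against it.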



At a recent meeting, the London Assembly’s Economy Committee heard that London’s businesses are failing to adequately invest in suitable climate change risk mitigation strategies.

The Economy Committee were told that large companies have substantial strategies in place to deal with climate change risks; however, SMEs across the capital are generally unaware of the significant threats to their business posed by climate change and severe weather events both in London and to their supply chains abroad.

Jenny Jones AM, Chair of the Economy Committee said:

“It is vital that business owners in London, SMEs and large companies alike, understand the very real risk that climate change and severe weather events, both here and abroad, can have on the future success of their companies. But today we heard that SMEs, which account for 90 percent of business in the capital, do not have the resources to protect and adapt themselves to the impact of severe weather events.



National Retail System (NRS) has released the results of a survey into how a West Coast ports strike could impact logistics in the USA. With a strike looming as the holiday season gets nearer, the NRS' survey found that only 52 percent of the companies who responded are prepared for such an eventuality.

The logistics professionals surveyed came from a variety of different sectors including 36 percent in manufacturing and 18 percent in retail, as well as 23 percent working for other 3PL logistics providers across the USA.

The anticipated strike has seen a variety of contingency plans put in place. The most popular of these is to use alternative ports. The biggest winners among the alternatives are the East Coast ports of New York/New Jersey and Boston, with 39 percent of businesses choosing to route trade through them. The up-and-coming port of Savannah, Georgia is the next most popular option with 26 percent, and the Canadian Port of Vancouver is seen as the third-best option for a further 23 percent of companies. While all of these ports are likely to see a short-lived boom if the strikes take place, what will be interesting is how much trade will not return after a strike ends and will still be routed through these destinations. Will businesses want to mitigate future risk and leave a proportion of their imports coming through alternatives?



Technology is beginning to dominate many aspects of the emergency management profession. This is particularly evident during disaster response. Today we have a number of large technology companies that offer their software or services for larger-scale disasters. Chief technology officer for Microsoft Disaster Response, Tony Surma, answered questions about technology’s use in emergency management.

Surma is responsible for the worldwide team and program at Microsoft focused on delivering technologies and technical assistance to communities, responders and customers, both in response to natural disasters and in support of proactive resiliency efforts. He has been a part of the Microsoft Disaster Response team from the start — first as a volunteer global coordinator for solution builds and deployments in times of disaster response and, more recently, as the lead for the program. Between response efforts, his focus is on building proactive partnerships and cross-organization initiatives, such as Humanitarian Toolbox, to operationalize innovations for use during response and leverage trends in technology and solution development to the benefit of response organizations and community readiness.



The Business Continuity Institute (BCI) and the Association of Contingency Planners (ACP) are proud to announce a new strategic partnership that will further enhance the networking opportunities offered to business continuity professionals across North and Central America.

Networking and the sharing of ideas and experience are fundamental benefits of being a member of a professional institute but this is not always easy in regions so large and diverse as North and Central America. By forming this alliance between the BCI and the ACP it will help address those challenges.

As part of the partnership, BCI members will have access to local ACP Chapter meetings, events and services. The BCI will also participate in the ACP’s National Leadership Conference, where it will be able to highlight its influence in the discipline of business continuity from its global membership of more than 8,000 members in more than 100 countries.

Commenting on this new partnership, Steve Mellish FBCI, Chairman of the BCI, said: “The BCI and the ACP have worked together in an informal way in the past as both share the common goal of promoting the need for business continuity within organizations of all shapes and sizes. This new strategic partnership builds on that relationship.  With the continuing evolution of the discipline, partnering with the ACP will bring more networking opportunities for both memberships as well as access to the BCI's world-renowned thought leadership activities much more effectively. These are very exciting times for the BCI in the Americas and we are proud to partner with the ACP to work together for the benefit our members.”

Michael Gifford MBCI, ACP Chairman said: "ACP is committed to the business continuity profession and as an organization dedicated to protecting lives, safeguarding businesses and fostering community resiliency. Our new partnership will create new growth opportunities for both ACP and BCI. We are very pleased to take this journey with the BCI."

The ACP has Chapters across North America, so if you are interested in finding out more about the Chapter local to you then click here.


If you look through the literature on disaster recovery, you’ll probably see that practical ideas, recommendations and methods abound – but that theory is in rather shorter supply. This makes sense in that all those IT systems and networks are running now – so if they break, you’ll want some good ‘cookbooks’ or ‘how-tos’ for mending them rapidly. However, with DR management comes DR planning, which is the chance to step back and better understand the key principles that govern effective DR. The CAP theorem for distributed IT systems is one example. Better still, it’s simple to grasp and has immediate practical application.
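As a toy illustration of the principle (not from the article, and deliberately simplified), the CAP trade-off says that during a network partition a replica must choose between staying consistent and staying available:

```python
# Toy sketch of the CAP trade-off. During a partition, a "CP" replica
# refuses requests whose freshness it cannot confirm, while an "AP"
# replica answers with possibly stale data. Purely illustrative.

class Replica:
    def __init__(self, mode):
        self.mode = mode          # "CP" (consistency) or "AP" (availability)
        self.value = "v1"         # last value confirmed with the peer
        self.partitioned = False  # True when the peer is unreachable

    def read(self):
        if self.partitioned and self.mode == "CP":
            # Consistency choice: freshness cannot be confirmed, so refuse.
            raise RuntimeError("unavailable during partition")
        # Availability choice (or a healthy network): answer, possibly stale.
        return self.value

cp, ap = Replica("CP"), Replica("AP")
cp.partitioned = ap.partitioned = True

print(ap.read())        # answers "v1" – available, but possibly stale
try:
    cp.read()
except RuntimeError as err:
    print(err)          # refuses – consistent, but not available
```

For DR planning, the practical consequence is that a replicated system’s behaviour under partition is a design decision to make and document in advance, not something to discover during an outage.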



MONTGOMERY, Ala. – Some disaster survivors think that U.S. Small Business Administration loans are only for businesses. That is not the case – in fact, the SBA disaster loan program is the primary source of federal funds for long-term recovery assistance for disaster survivors.

SBA offers disaster loans at rates as low as 2.188 percent to homeowners and renters, at 4 percent for businesses of all sizes and at 2.625 percent for private nonprofit organizations for physical damage from the April 28 through May 5 severe storms, tornadoes, straight-line winds and flooding in the following Alabama counties: Baldwin, Blount, DeKalb, Etowah, Jefferson, Lee, Limestone, Mobile and Tuscaloosa.

Economic injury disaster loans also are available to provide working capital to eligible small businesses and nonprofit organizations located in the counties listed above and the adjacent counties.

There are good reasons for FEMA applicants who have been contacted by SBA to submit a completed disaster loan application before the July 1, 2014 deadline. Reasons include:

  • A future insurance settlement may fall short. Survivors may find out they are underinsured for the amount of work it takes to repair or replace a damaged home. An SBA low-interest loan can cover the uninsured costs. By submitting the loan application, survivors may have loan money available when it is needed. SBA can approve a loan for the repair or replacement of a home up to $200,000. The loan balance will be reduced by a survivor’s insurance settlement. However, the opportunity for an SBA disaster loan will be lost if they wait until after the application deadline.
  • SBA can help renters repair or replace disaster damaged personal property. Renters as well as homeowners may borrow up to $40,000 to repair or replace clothing, furniture, appliances and damaged vehicles.
  • By submitting an SBA loan application, survivors keep the full range of disaster assistance available as an option. SBA may refer applicants who do not qualify for a home loan to FEMA for “Other Needs” grants to replace essential household items, replace or repair a damaged vehicle, cover medical, dental and funeral expenses and other serious disaster-related needs. But if survivors do not submit their disaster loan applications, the assistance process stops. Survivors are not required to accept a loan offer.

For more information, homeowners, renters and businesses may call the SBA at 800-659-2955 (TTY 800-877-8339), send an email to DisasterCustomerService@SBA.gov or visit SBA.gov/Disaster. Survivors can complete disaster loan applications online at https://DisasterLoan.SBA.gov/ELA.

Survivors who have not yet registered with FEMA can do so online at DisasterAssistance.gov, with a mobile device at m.FEMA.gov, or by calling the FEMA helpline at 800-621-3362 (FEMA) or TTY 800-462-7585.

The deadline to register for disaster assistance and an SBA loan is July 1, 2014 for property damage. The deadline for Economic Injury Disaster Loans is February 2, 2015.

The Federal Emergency Management Agency and the U.S. Small Business Administration offer assistance programs for homeowners, renters, and business owners in nine Alabama counties designated for Individual Assistance.

High-profile Big Data success stories tend to focus on ridiculously large volumes and trendy data, such as social media data. In the real world, Big Data looks a lot different, according to data management consultant Gary Allemann.

Allemann is the managing director at the South African consultancy Master Data Management, so right off the bat you know he will have a different perspective on Big Data than the Silicon Valley set. In “Five More Big Data Myths Busted,” Allemann argues that for many companies, Big Data’s value has little to do with astronomical volumes of data or even social media data.

And Big Data is certainly not gunning to take over the enterprise data warehouse at this point, he adds. Actually, companies adopt Big Data as a supplement to the enterprise data warehouse because Big Data solutions allow them to combine structured data with unstructured data.



It is often said that the most important asset an organization can have is its staff, so it would seem logical for an organization to have a plan in place for when staff move on, taking their skills and knowledge with them. This is not always the case, however. According to a white paper by SEI and FP Transitions, fewer than half (45%) of advisors polled have a continuity plan in place in the event of an unexpected departure or leave of absence. This is despite the claim that 99% of independent financial services and advisory practices go out of business when their founder retires.

The white paper, titled ‘Acquisition and Succession: Shift Your Focus from Retirement to Growth,’ surveyed 771 financial advisors to gain insights into their acquisition, succession planning and continuity planning activities. It notes that firms must increasingly view succession planning as a growth strategy, not a retirement strategy, and reveals that while nearly one-third (32%) of advisors claim to have a succession plan, only 17% have a binding and actionable agreement. This data points to the need for advisors to re-assess their succession planning goals and strategies.

"Advisors are beginning to realize that succession plans and continuity plans can actually become growth tools,” said John Anderson, Head of SEI Practice Management Solutions, SEI Advisor Network. “By taking the time to plan for the future, advisors are giving themselves a key competitive advantage in the present. The process gives them a clearer picture of their firms' overall health, prioritizes finding a new generation of talent, and sends the message to clients that the firm will be viable for years to come.”

"Succession planning isn’t just about figuring out who’s going to take over when you’re gone," said David Grau Sr., President and Founder of FP Transitions. "It’s about building a business that will support your long term vision, and which will continue to serve clients even when you’re not around as much.  Whether that means preparing the firm for acquisition or extending ownership to the next generation, continued growth is essential to a successful transition."

The data suggests, however, that most advisors have given thought to succession planning and continuity planning, even if they do not currently have all of the tools needed to execute a plan or strategy. Of those without a business continuity plan, nearly seven in ten (69%) plan to implement one over the next few years.


In releasing its second quadrennial review, a 104-page report, the U.S. Department of Homeland Security (DHS) outlines its efforts to enhance the five homeland security missions it detailed in the first review in 2010.

With disasters like the Deepwater Horizon oil spill in 2010, Hurricane Sandy in 2012 and the Boston Marathon bombings in 2013, as well as the increasing cyberthreat as the backdrop, the report outlined what it called a more risk-based approach to the significant threats from terrorism and natural hazards.

The mission of the DHS, of course, continues to be combating terrorism, but the department is also taking an all-hazards approach: recognizing the trends in natural hazards brought on by a changing climate, and working to understand and mitigate the possibility of a devastating pandemic.



Why are some countries more resistant to supply chain disruption or better able to bounce back?

According to Margareta Wahlström, United Nations Special Representative of the Secretary-General (SRSG) for Disaster Risk Reduction, this is a puzzle that world leaders are perpetually trying to solve.

Hence the inherent value in a new online interactive tool from FM Global that ranks countries by supply chain resilience.

The 2014 FM Global Resilience Index ranks the business resilience of 130 countries around the world.

Nine key drivers of supply chain risk are grouped into three categories: economic, risk quality and supply chain factors. These combine to form the composite index. Scores are bound on a scale of 0 to 100, with 0 representing the lowest resilience and 100 the highest resilience.
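The combination described above can be sketched in a few lines. FM Global's actual methodology is not reproduced here, so the driver names, equal weighting and simple averaging below are purely illustrative assumptions:

```python
# Illustrative sketch only: the real FM Global Resilience Index weighting is
# proprietary; driver names and equal weights here are invented assumptions.
def composite_score(drivers):
    """Combine per-driver scores (each on a 0-100 scale) into a 0-100 composite."""
    if not drivers:
        raise ValueError("at least one driver score is required")
    for name, score in drivers.items():
        if not 0 <= score <= 100:
            raise ValueError(f"driver {name!r} out of range: {score}")
    return sum(drivers.values()) / len(drivers)

scores = {
    # three hypothetical drivers per category, equally weighted
    "gdp_per_capita": 80, "political_risk": 70, "oil_intensity": 60,        # economic
    "natural_hazard_exposure": 75, "risk_quality": 85, "fire_risk": 65,     # risk quality
    "corruption_control": 55, "infrastructure": 70, "local_suppliers": 90,  # supply chain
}
print(round(composite_score(scores), 1))  # 72.2 on the 0-100 resilience scale
```

A real index would likely weight the three categories unevenly and normalize raw inputs before combining them; the bounded 0-100 output is the one property taken directly from the article.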



Fortune CM&S and the Business Continuity Institute have announced the establishment of a formal, strategic partnership. The goal of this new collaborative partnership is to dramatically increase the level of awareness among Fortune’s North American business readers of the critical importance of business continuity management and to help raise the levels of resilience within their own corporations worldwide to ensure their organization’s long term success.

Newell Thompson, Vice President at Fortune’s Content Marketing & Strategies division said: "Fortune CM&S division selected the BCI to partner with them on this high profile feature because of the BCI's outstanding reputation as the world's leading institute for business continuity and their standards of excellence in the practice of Business Continuity Management (BCM). The BCI's mission of promoting and facilitating the adoption of international standards for business continuity helps to raise awareness of the importance of the practice of BC globally. The BCI works with some of the most well-respected global brands who embrace the practice of BCM and want to raise their corporate profile in the global BC arena."



Wednesday, 25 June 2014 16:09

The BIA Insult

So, I came across this quote the other day that someone was using in a presentation about the importance of conducting a Business Impact Analysis (BIA):

“A business continuity plan that is not predicated on or guided by the results of a business impact analysis (BIA) is, at best, guesswork, is incomplete, and may not function as it should during an actual recovery.”


I understand what they mean and I appreciate this message given to business continuity planners, but I would hesitate to say this in a board room. It may not be wise to suggest to the CEO and other senior executives that they do not know their business well enough to tell you what is important to them and which business processes are necessary to keep their organization solvent.



Each year hundreds of emergency management researchers, academics and practitioners gather to discuss the state of research across fields and hazards. The Natural Hazards Research and Applications Workshop is held just south of Boulder, Colo., in Broomfield, making the devastating flooding last September a natural fit as a focus throughout the conference. Representatives from Boulder and Lyons spoke about the current state of the response as well as how they had prepared their communities in advance of the flood, and researchers addressed what they observed during the emergency, including the use of emergency alerts. The following are six takeaways about the flood response that were shared during the conference.

ALERTS NEED TO BE SPECIFIC — What’s the best way to alert residents about an emergency? While the ideal order to list information has been debated, one thing has become clear from studies of emergency alerts: be specific. “Explain to people what you mean when you say ‘evacuate,’ otherwise they will make it up themselves,” said Dennis Mileti, director emeritus of the Natural Hazards Center, which hosts the workshop. Social scientists have said that warnings must tell people what to do, and Mileti said this was alive and well during the flooding in Boulder last September. For example, he cited an alert from the Boulder Office of Emergency Management that went out on Sept. 12, 2013, that said, “Shelter in place but move to upper floors, if possible. If this is not possible, these individuals should seek higher ground, at least 12 feet above creek level, without crossing the creek.” Mileti said he’s read all of the warnings that were issued during the flood and that Boulder did a “wonderful job” issuing warnings during the event.



Wednesday, 25 June 2014 16:06

Why You Should Still Worry About Heartbleed

CSO — Patching of Internet-connected systems that contain the Heartbleed bug has slowed to a snail's pace, and security experts are advising companies to take extra precautions to avoid a security breach.

Errata Security scanned the Internet late Friday and found roughly 309,000 sites with the bug, which is in the secure sockets layer (SSL) library of the OpenSSL Project. That number was only about 9,000 fewer than what Errata found a month ago.

When Heartbleed was discovered in April, Errata found more than 600,000 vulnerable systems on port 443, which is used by default for SSL-secured communications between clients and servers.



Wednesday, 25 June 2014 15:50

Why One CIO Is Saying 'No' to BYOD

CIO — Every six months, an employee at electrical contractor Rosendin Electric will walk into CIO Sam Lamonica's office in San Jose with a question: "How come I can't use my own phone for work?"

Rosendin Electric has thousands of employees, hundreds of smartphones, more than 400 iPads and a few Microsoft Surface tablets -- none of them employee-owned.

"We would probably never have a BYOD environment here," Lamonica says.

Lamonica isn't alone, either. There's a growing BYOD backlash among CIOs that threatens to derail the once-high-flying computing trend. For instance, CompTIA's spring survey of 400 IT and business executives found that 51 percent of respondents at large companies are not doing BYOD at all.



Although cyber attack now ranks among the top risks facing the global business community, many European boards face the challenge of adequately analysing and assessing how the threats associated with technology and the internet may affect their organisations.

To assist these firms in managing these cyber risks more effectively, Marsh has developed a model that helps users identify and evaluate the cyber risk scenarios they face, analyse their insurability and risk tolerance, and then model their insurable and non-insurable losses. The reporting data can then be used for risk financing, in the insurance market or through self-insurance.



Tuesday, 24 June 2014 16:30

Native Data Analysis Comes to MongoDB

CIO — Seeking to make it easier for you to apply analytics to your big data stores, Pentaho today announced the general availability of the latest version of its business analytics and data integration platform.

The Pentaho 5.1 release is intended to bridge the "data-to-analytics divide" for the whole spectrum of Pentaho users, from developers to data scientists to business analysts. Pentaho 5.1 adds the capability to run code-free analytics directly on MongoDB data stores, incorporates a new data science pack that acts as a data science "personal assistant," and adds full support for the Apache Hadoop 2.0 YARN architecture for resource management.
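Pentaho's tooling itself is code-free, but the "native analytics" being run against MongoDB is its aggregation pipeline. As a hedged illustration only (the collection and field names are invented, and the `$group` stage is evaluated in plain Python here so the sketch runs without a live MongoDB server):

```python
from collections import defaultdict

# Hypothetical order documents standing in for a MongoDB collection.
docs = [
    {"region": "east", "amount": 120},
    {"region": "west", "amount": 80},
    {"region": "east", "amount": 50},
]

# The aggregation stage a driver such as pymongo would send to the server:
# group orders by region and sum the amounts.
pipeline = [{"$group": {"_id": "$region", "total": {"$sum": "$amount"}}}]

# Plain-Python evaluation of that $group stage, so the sketch is standalone.
totals = defaultdict(int)
for doc in docs:
    totals[doc["region"]] += doc["amount"]
print(dict(totals))  # {'east': 170, 'west': 80}
```

With a real deployment the pipeline would be passed to `collection.aggregate(pipeline)` and executed inside the database, which is the point of "native" analysis: the data never has to be exported first.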

"The new capabilities in Pentaho 5.1 support our ongoing strategy to make the hardest aspects of big data analytics faster, easier and more accessible to all," says Christopher Dziekan, executive vice president and chief product officer at Pentaho. "With the launch of 5.1, Pentaho continues to power big analytics at scale, responding not only to the demands of the big data-driven enterprise but also provides companies big and small a more level playing field so emerging companies without large, specialist development teams can also enter the big data arena."



CIO — You've certainly heard a lot about the cloud — the public cloud, that is, run by software vendors and outsourced completely. You've heard the standard advice about why the public cloud has certain technical advantages and disadvantages, too. However, there's an inconvenient truth to the public cloud that has been brewing for a while: Its effect on IT pros.

Cloud Vendors Creating False Choices

As part of software companies' push to move their customers to cloud versions of their products, many companies introduce features or capabilities available in the hosted service versions of their programs that aren't immediately available in the on-premises version of the software. In some cases, these features aren't on the roadmap at all to be ported to on-premises systems.

We've heard from Microsoft, for example, that major server products such as Exchange and SharePoint will be as close to feature equivalent as possible. We've even heard promises that new technology such as the Office Graph will be ported back to the boxed software release designed to be run in your server closet. These commitments have been walked back, much to the dismay of existing customers and IT pros.



Business continuity often inspires a feeling of ‘disaster averted’. In other words, the perception is that spending money on business continuity is really an insurance policy, and as such brings no positive benefit, but helps to avoid negative outcomes. It’s true that this is an essential role. As its name suggests, the avoidance of business discontinuity or interruption is inherent in the pursuit of business continuity. However, business continuity can and should have a net positive effect as well.



In an unexpected twist, Big Data is driving adoption of data archiving, according to Gartner.

When people first started talking about Big Data technology, some said it would eliminate the need to worry so much about archiving or, at least, that Hadoop clusters would take on that role. Ironically, it’s the increased adoption of Hadoop that is now forcing organizations to look at data archiving, CMS Wire reports. Growth of structured data is a particular concern as organizations try to separate useful from non-essential data, the article notes.

Don’t worry, it’s expected to come full circle. The article notes that Gartner’s latest report, the Magic Quadrant for Structured Data Archiving and Application Retirement, predicts that archiving needs will be so robust that by 2017, 75 percent of structured data archiving applications will have to incorporate Big Data analytics.



Tuesday, 24 June 2014 16:26

Could BYOD Increase Insider Threats?

A new study commissioned by Raytheon and conducted by the Ponemon Institute provides a fresh look at the insider threat. In a nutshell, we can expect the insider threat to increase. According to FierceMobileIT:

Focusing on 'the human factor,' the survey report, "Privileged User Abuse & The Insider Threat" finds that many individuals with the highest levels of network access in organizations are often granted access to data and areas of the network not necessary for their roles and responsibilities. The report reveals that 65 percent of survey respondents indicated that curiosity – not job necessity – drives these same individuals to access sensitive or confidential data.



It’s only a matter of time before a catastrophic earthquake hits the Pacific Northwest, but what happens after the shaking subsides?

Aging buildings across the area would likely collapse, causing scores if not hundreds of deaths and injuries. Roads could become impassable, and many businesses throughout the region would likely cease to offer services for some time — completely changing the face of our region and the communities within as we know them.

The scenario is real, and that’s what brought engineers, emergency managers, public officials and interested citizens from across the Northwest to Centralia College on Thursday. The second day of the Construction and Best Practices Summit hosted by the college and the Pacific Northwest Center of Excellence for Clean Energy focused on how to best prepare for and recover from an earthquake along the Cascadia Subduction Zone, a 1,000-kilometer fault stretching from Vancouver Island to Cape Mendocino, Calif.



It seems that the march to private cloud infrastructure is finally under way in earnest, with both the technology and the business case for its deployment at a sufficient level of maturity for large numbers of enterprises to pull the trigger.

This does not mean all questions have been answered, however. In fact, if the private cloud has anything in common with legacy infrastructure, it’s that the tweaking and fine-tuning will likely continue well into the future.

One of the first dilemmas, in fact, is the selection of a platform. To date, VMware has captured the lead in enterprise cloud deployments, according to database service provider Tesora, although OpenStack is rapidly closing the gap. In the company’s latest survey of North American developers, VMware owns about 15 percent of the market, compared to OpenStack’s 11 percent. Top applications for both public and private clouds are database processing for SQL, MySQL and other platforms, followed by web services and quality assurance. Interestingly, only about 9 percent indicated compatibility with Amazon Web Services as a top priority in designing a private cloud.



CIO — We know in our gut that data has value. No company can run without it. But what is it really worth? As CEOs realize that data is an asset that can be exploited as a new source of revenue, they will start to ask CIOs about its financial potential. Responding with a shrug and a shot in the dark won't exactly enhance your own value in the CEO's eyes.

Patents, trademarks and other forms of intellectual property have long been accounted for as intangible assets in a company's financial reports. But those numbers are only estimates that may or may not include more mundane kinds of information, such as customer profiles. That's partly because no standard method or accounting procedure exists for putting a dollar value on data.

"It's frustrating that companies have a better sense of the value of their office furniture than their information assets," says Doug Laney, a Gartner analyst who studies information economics. "CIOs are so busy with apps and infrastructure and resourcing that very few of them have cycles to think about it."



There’s a lot of talk of organizations becoming resilient and how they need to be resilient if they are to compete successfully and respond accordingly to the ever-increasing disasters of the world – both man-made and natural in causation. But that begs the question: can organizations be resilient? In this practitioner’s opinion, yes, they can, though it takes more than a single aspect to become resilient.

Many would have you believe that you can buy resiliency off a shelf: a service or product purchased from a firm touting that it can make your organization resilient, as though the procurement of a ‘product’ were all it takes. Well, unless they are pseudo-psychologists or have a background in leadership psychology, they can’t; at least not completely. Sure, it’s fine to say that Business Continuity Plans (BCP) and Technology Recovery Plans (TRP) et al will make an organization resilient, but that’s just not the complete picture. It’s only part of the overall picture of what will make an organization resilient.

It’s just not a simple concept – though it would be great if it was. What will make an organization resilient? Is there some sort of magic ingredient that will suddenly ensure that an organization will bounce back from any adverse situation? Well, yes and no. It’s not one single ingredient; it’s multiple ingredients that, when combined just so, will help any organization get through difficult situations.



Officials are using this time as an opportunity to tweak disaster plans, practice emergency drills and brace for potentially devastating storms later in the summer.

Ken Kaye, McClatchy News | June 20, 2014

Hurricane season so far has been business as usual for most of us, with the tropics nice and calm.

But for emergency managers, this slow stage is an opportunity to tweak disaster plans, practice emergency drills and brace for potentially devastating storms later in the summer.

"This is the time of season when we're putting final touches on training, exercising and making sure we're ready," said Bill Johnson, Palm Beach County's emergency management director.



The news: It's Ebola.

The largest outbreak ever of the hemorrhagic fever is spreading "totally out of control" in West Africa, according to a senior official with Doctors Without Borders. The World Health Organization reports that some 330 deaths are now considered linked to the deadly virus in Guinea, Sierra Leone and Liberia.

"The reality is clear that the epidemic is now in a second wave," Doctors Without Borders operations director Bart Janssens told the AP. "And, for me, it is totally out of control."

German specialists previously said that the epidemic killing people across West Africa was caused by a new strain of the Zaire ebolavirus, which killed 88% of its victims in the first known outbreak in 1976. Doctors have managed to keep the fatality rate lower than in previous epidemics, but the current outbreak is killing approximately 64% of its victims.



Following Continuity Central’s recent survey into business continuity software usage we asked some of the key suppliers of business continuity software to answer a standard set of questions about trends in the business continuity software market. The responses can be read below:

More entries will be posted as they become available.

Software suppliers: if you would like to have your responses listed above please contact editor@continuitycentral.com

Friday, 20 June 2014 15:28

Lessons Learned from Heartbleed

By Russ Spitler

Without question, Heartbleed is one of the most catastrophic events from an Internet security standpoint of the past ten years, arguably ever. It left IT and security teams scrambling to fix the vulnerability and sent the media into a frenzy. As the dust settles after the initial Heartbleed crisis response, what lessons are starting to emerge?

A quick recap

Heartbleed is a vulnerability in OpenSSL that permits attackers to access random blocks of memory from servers running OpenSSL. OpenSSL is used to establish encrypted communication channels between two endpoints, and the servers running this software therefore hold some significant secrets: specifically, the encryption keys. Simply explained, the process used for setting up OpenSSL encryption uses a key pair: a private key and a public key. These two keys are bound together, and you cannot replace one without also modifying the other. Then money is paid, fancy algorithms are applied and an SSL certificate is obtained, which is used to affirm identities when establishing a secure connection.
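The over-read at the heart of the bug can be shown with a toy model (this is not real OpenSSL code; the buffer contents, function names and sizes below are invented for illustration). The flaw was that the server trusted the attacker-supplied length field in a heartbeat request and echoed back that many bytes of memory:

```python
# Toy model of the Heartbleed flaw (CVE-2014-0160). Everything here is a
# simplified stand-in: real OpenSSL works on C buffers, not Python bytes.
SERVER_MEMORY = b"PAYLOAD" + b"-----BEGIN RSA PRIVATE KEY----- secret material"

def buggy_heartbeat(payload: bytes, claimed_len: int) -> bytes:
    # Vulnerable: no check that claimed_len matches the actual payload size,
    # so the response can spill adjacent memory back to the requester.
    return SERVER_MEMORY[:claimed_len]

def patched_heartbeat(payload: bytes, claimed_len: int) -> bytes:
    # The fix: discard any heartbeat whose claimed length exceeds the payload.
    if claimed_len > len(payload):
        raise ValueError("heartbeat length exceeds payload; request dropped")
    return SERVER_MEMORY[:claimed_len]

# A 7-byte payload with a claimed length of 40 leaks 33 extra bytes.
leak = buggy_heartbeat(b"PAYLOAD", 40)
print(b"PRIVATE KEY" in leak)  # True: adjacent secrets leak out
```

Because the leaked region could contain the private key itself, patching alone was not enough; affected servers also had to revoke and reissue their certificates.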



BC Management’s latest annual survey of compensation levels in the business continuity and related sectors has discovered that UK-based business continuity consultants / independent contractors make more than twice as much money as their US counterparts, and more than three times as much as Canadian business continuity consultants.

The online study was launched in December 2013 and remained open through to March 2014. A total of 1,520 respondents from over 30 countries took part, with 116 independent contractors providing compensation information.

Independent contractors are defined as ‘professionals classified as performing contract work to another entity under terms specified in an agreement. Unlike an employee, an independent contractor does not work regularly for an employer, but works when and as required.’

The survey found that the 2013 average total compensation for independent contractors in $USD was:

UK: $303,278
Europe: $248,545
USA: $140,601
Asia Pacific: $130,311
Canada: $83,982

Total compensation for independent contractors rose on average, with earnings increasing 8.8 percent internationally and 6.3 percent for USA-based professionals.


Luxury goods companies believe that they face greater reputational risk than those in other industries, according to a report published by ACE Group in Europe. Following a survey with a concentrated sample of 45 European luxury goods firms and a series of in-depth expert interviews, the report also concludes that environmental, business travel and directors and officers liability (D&O) are three emerging risks for the industry to watch.

Some 75 percent of senior risk executives from the industry sample state that reputation is their company’s greatest asset and 80 percent agree that reputational risk is the most difficult individual risk category to manage.

Almost six in ten respondents report that globalisation has increased the interdependency of the risks they face. They rank a lack of risk management tools and processes, insufficient budget, and a lack of management time, human resources and skills as the greatest barriers to effective management of reputational risk.



Sometimes, the business doesn’t care about data quality. It’s a hard thing to hear, but someone has to be honest with you about it, and Capgemini’s Big Data and Analytics expert Steve Jones is stepping up to do it.

Actually, Jones is talking about master data management (MDM). It’s often mistaken for a data quality project, he writes, but the primary goal of MDM isn’t data quality these days. It’s really collaboration.

If that sounds like a major departure from what you’ve read in the past, you’re right. Data quality, along with data governance, has long been heralded as a key component of finding success with MDM.



Thursday, 19 June 2014 16:22

Responding to global risks

Business leaders are not doing enough to prepare for the risks that arise from our increasingly inter-connected world, such as government debt crises, extreme weather events and social instability, claims the Institute of Directors in a new report, Responding to Global Risks: A practical guide for business leaders.

This is a concern highlighted in the 2013 BCI Supply Chain Resilience Report which identified that, as supply chains become ever more complex, organizations find it difficult to manage them effectively. In the survey that led to the report, 75% of respondents admitted to not having full visibility of their supply chain disruption levels and 42% had experienced a disruption below their immediate supplier the previous year.

Charles Beresford-Davies, Managing Director and UK Risk Management Practice Leader at Marsh, said: “No company operates in isolation. Every business, no matter how large or small, is part of a complex global network of suppliers, outsourcers and customers, all of which are subject to resiliency risk.”



If we have learned any lessons from the last few years, it is that data breaches present a significant business risk to organizations, often resulting in high financial cost and damage to public opinion. According to a recent study, the average cost of a data breach incident is approximately $3.5 million. With reputation management and a complex regulatory landscape as additional organizational concerns, security and risk professionals face the tough task of ensuring their companies successfully manage the aftermath of a data breach.

A crucial aspect of data breach preparedness is having a strong understanding of the legislative and regulatory framework around data breach notification. However, set against a patchwork of 47 existing laws from nearly every U.S. state, risk and compliance professionals are challenged with understanding and communicating rights for their business and customers. The recent mega breaches experienced by several large companies in the United States have resulted in heightened consumer, media and policymaker awareness and concern, making the potential for new requirements and legislation a hot topic.



Today, we’re adding metadata to the list of issues that will need to be addressed before data lakes are a useful, realistic concept.

Recently, I’ve been sharing the key concerns and barriers around data lakes. Data lakes, at least in theory, are what you get when you pull Big Data sets, including unstructured data, together. The idea is that data lakes will replace or at least supplement data marts for accessing enterprise-wide information.

Vendors have been hyping up data lakes, but many experts are questioning how realistic data lakes are right now. The challenge isn’t so much creating them as it is managing the data in a useful way, experts say.



Thursday, 19 June 2014 16:20

Maintaining Data Protection in the Cloud

Enterprises of all types and sizes are quickly ramping up their cloud presences, despite the fact that key questions regarding their reliability and efficacy remain.

A leading source of worry is data protection. Once data leaves the safety of the firewall, ensuring both its security and availability becomes largely a matter of trust.

Many organizations, in fact, are already struggling with the shift from an infrastructure-based protection scheme to a federated or virtual/application-layer solution, even without the cloud. As HP’s Duncan Campbell points out, the increase in data load and the already largely distributed nature of many enterprise data environments, not to mention the introduction of mobile communications, are forcing a rethink when it comes to maintaining access and availability. If you are looking at 20 to 25 percent data growth per year, how much longer will you be able to maintain local protection and security solutions at every remote site and branch office? At some point, the need for an integrated solution that cuts across geographic and infrastructure boundaries becomes evident, which is why the company developed the StoreOnce Backup solution with tools like federated deduplication, autonomic hardware restart and secure erase.
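That 20 to 25 percent annual growth figure makes the squeeze easy to quantify: compounding at those rates, a data estate doubles in three to four years. A back-of-the-envelope sketch:

```python
import math

# Back-of-the-envelope only: how quickly does a protected data estate
# double at the 20-25 percent annual growth rates cited above?
def years_to_double(annual_growth_rate: float) -> float:
    """Solve (1 + r) ** t = 2 for t."""
    return math.log(2) / math.log(1 + annual_growth_rate)

for rate in (0.20, 0.25):
    print(f"{rate:.0%} growth -> doubles in {years_to_double(rate):.1f} years")
# 20% growth -> doubles in 3.8 years
# 25% growth -> doubles in 3.1 years
```

Every remote site and branch office holding local backup capacity faces that same doubling curve independently, which is the arithmetic behind the argument for an integrated, deduplicated solution.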



Six years after Hurricane Dolly struck a $1.35 billion blow to the South Texas coast, federally funded reconstruction efforts are just now getting under way for hundreds of Lower Rio Grande Valley residents whose homes were destroyed or badly damaged by the storm.

Nick Mitchell-Bennett, executive director of the Community Development Corporation of Brownsville, blames the situation on a “long and outrageously convoluted” federal, state and local process for getting help to storm-ravaged poor families.



After severe weather hit the state of Georgia earlier this year, Gov. Nathan Deal called for an improved emergency app, and on June 16, that app was released.

The upgraded Ready Georgia app maintains old features and adds several new features, including geo-targeted severe weather and emergency alerts that notify users based on their locations before an event, such as severe weather, occurs. Users can access live traffic maps and incident reports directly from the Georgia Department of Transportation, as well as obtain a map of local American Red Cross and approved Good Samaritan shelters, along with directions to those shelters from their location. 



CSO - A data breach like the one recently reported by AT&T demonstrates that security policies alone are only a paper tiger without the technological teeth to make sure they are enforced, experts say.

AT&T reported last week that unauthorized employees of one of its service providers accessed the personal information of AT&T wireless customers. The exposed data included Social Security numbers and call records.

AT&T did not say how many records were accessed, but the number was high enough that the carrier had to report the breach to California regulators.

While there was no indication of criminal intent, the service provider's employees "violated our strict privacy and security guidelines," AT&T said.



Multiple outbreaks of severe weather led to a costly month for insurers in the United States in May, as thunderstorm events continued to dominate the catastrophe record.

According to the latest Global Catastrophe Recap report by Aon Benfield’s Impact Forecasting, no fewer than four stretches of severe weather affected the U.S. during the month of May.

Aggregate insured losses exceeded $2.2 billion and overall economic losses were at least $3.5 billion, with large hail and damaging winds the primary driver of the thunderstorm-related costs, Impact Forecasting reports.

The costliest stretch occurred during a five-day period (May 18-23) which saw damage incurred in parts of the Midwest, Plains, Rockies, Mid-Atlantic and the Northeast, including the major metropolitan areas of Chicago, IL and Denver, CO.





ASIS has released a standard that provides guidance for establishing and managing an audit program, as well as conducting individual audits consistent with the ISO 19011 and ISO/IEC 17021 standards.

The latest in the five-part series of ASIS resilience standards that offer a holistic, business-friendly approach to risk and resilience management, the Auditing Management Systems: Risk, Resilience, Security, and Continuity - Guidance for Application American National Standard (SPC 2) will help practitioners evaluate risk and resilience-based management systems, establish and manage an audit program, conduct individual audits, and identify competence criteria for auditors who conduct conformity assessments of risk and resilience-based management systems.


UK employees are potentially putting their companies at risk of cyber-attack when using mobile devices for work purposes while on holiday or on a short break, new research has found.

The ‘Beach to Breach’ research commissioned by Sourcefire, now part of Cisco, found that 77 percent of UK workers surveyed usually take their work devices with them on holiday, with 72 percent choosing to spend up to one or two hours per day keeping up with what’s going on in the office. Over 80 percent of directors, mid-managers and senior level employees admitted to taking their work device on holiday, and even the most junior employees are also keen to stay connected while away with 50 percent unwilling to leave their work device at home.



I don’t think anyone really thought that Hadoop and other Big Data technologies would liberate us from the basics of data, such as integration and governance. It was just so easy to ignore those issues in the heady first years of Big Data hype and pilot projects. Now, it’s time to do the hard work of figuring out how to make all this data useful.

And, frankly, the to-do list just keeps growing.

Data integration expert David Linthicum added his concerns about data integration tools in a recent Informatica blog post. Linthicum is piggy-backing on an idea proposed by analytics expert Tom Davenport. After interviewing data scientists for his research, Davenport concluded that the only way to support the demand for Big Data analytics is to provide the data scientists with better tools.



Computerworld UK — Companies that want to engage customers with wearables, but are worried about privacy issues, should run pilots with their employees first, a Forrester analyst has said.

Highlighting the success Virgin Atlantic has had with its Upper Class Wing Google Glass pilot in Heathrow Terminal Three, Forrester analyst JP Gownder advised that arming customer-facing employees with wearables is the first step enterprises should be taking.

Virgin Atlantic's pilot saw business club lounge staff in Heathrow wearing devices to assist members with flight connection information, destination weather forecasts and restaurant suggestions.



The biggest threat to public sector data comes from employees, a new report suggests. Some 83% of the 141 senior public sector managers and other staff polled said they were most concerned about internal loss or misuse, with just 10% worried about the external threat posed by hackers.

Despite this, only 18% use a secure managed offsite records facility, with 41% storing data on-site and 21% relying on staff to dispose of documents using general waste, recycling bins and office-based shredding machines.

“Physical records stored within public sector buildings are extremely vulnerable to being lost or misplaced by employees,” says Anthony Pearlgood, managing director, PHS Data Solutions, which commissioned the research.



At a gathering of the UK’s risk managers today, Mike McGavick, CEO of XL Group, told risk managers “it’s a great time to be in your jobs, there is great opportunity for you to lead your organisations’ thinking about risk.”

Speaking on a panel debate focused on The State of the Insurance Market, at this year’s Airmic Conference, McGavick said: “Excess capital, the low interest rate environment and the mutation of risk means insurers have to dig deep, working harder to find differentiating solutions and services.”

“This environment provides risk managers with the opportunity to ask, what are we getting from you? And these searching questions are challenging insurers to innovate and stay relevant.



Wednesday, 18 June 2014 14:52

A New Data Center for a New Age

That the data center will have to evolve in order to keep up with changing application and data workloads is a given at this point. Static, silo-based architectures simply lack the flexibility that knowledge workers need to compete in a dynamic data economy.

But exactly how will this change be implemented? And when all is said and done, what sort of data center will we have?

According to a company called Mesosphere, the data center will become the new computer. The firm provides management software that helps hyperscale clients like Google and Twitter coordinate and pool resources across diverse application loads. By offering compute cluster, command line and API access to developers, the Apache-based platform enables broad deployment and scalability without the need for direct IT involvement. As well, it allows numerous low-level support tasks to be automated, essentially allowing users to call up applications or save data in the data center the way they do on a PC: Click the icon and let the system figure out the best way to handle it.



When you hear “public health,” you may think of flu shots. That’s one visible — and briefly painful — side of public health services. But if you’ve enjoyed tobacco-smoke-free air, thought twice about ordering a cheeseburger after seeing its calorie count on a menu, or not worried about tuberculosis in your community, you’ve also “used” public health services. These services are essential, ubiquitous and usually unnoticed.

They’ve also been hit hard by the recession. Since 2008 about 17 percent of the state public health workforce and 22 percent of the local public health workforce have been eliminated, according to a 2011 report from the Association of State and Territorial Health Officials. Several reports have enumerated how, as a result of these cuts, we’re more vulnerable to communicable diseases, water-borne infections and other health concerns.



Wednesday, 18 June 2014 14:49

The Many Paths to a Career in Risk

Over the years, I’ve had no shortage of people ask me how they can get my job as a senior risk leader. They see the possibilities and get a strong sense that risk management just might be a pretty interesting career track. Oftentimes these folks are sitting in some insurance-related sub-function within the broader industry, anything from claims to loss control to underwriting and brokerage. Interestingly, many people with this experience (who are essentially developing into specialists in these sub-functions) have found that transferring their skills from these specialized areas to the broader risk profession is often fraught with hurdles.

I have seen a parallel mind-set throughout much of my career in the various industries in which I sought alternate employment. Most commonly it was the manufacturing or health care sectors that insisted that any leader in their ranks, most especially a risk manager, needed to come from within their industry. They were the true believers and were typically inflexible about this minimum requirement. They believed their industries were just too specialized and unique for a risk manager from another industry to succeed, and they were unwilling to invest in letting outsiders learn their world and develop the full skill-set, especially at the mid- to senior-manager level.



Wednesday, 18 June 2014 14:48

Lessons from Target’s Security Breach

There are times when major trends intersect. Sometimes they reinforce each other; other times they cancel each other out. In the case of Target’s security problems, there seems to have been a fair amount of interference (to read my earlier Advisor on the Target security breach, see “Cyber Security: Inside and Out“). The FireEye software that was supposed to warn of the kind of exposure that did Target in reacted as it was supposed to: the basic problem was flagged and diagnosed immediately, and a warning message was included in one of the security logs and highlighted by analysts at Target’s Bangalore security center. Unfortunately, the critical message was not deemed worthy of immediate action by the central security staff in Minneapolis.

As it turned out, there were multiple reasons that Target’s central security group didn’t follow up on the suspicious activity flagged by FireEye and the Bangalore team. One reason given for not acting was that the central team wanted to manually review all the critical flags. A second reason was that there was such an enormous number of flagged items on all the different security logs that it was difficult to follow up on any but the most important ones in a reasonable time frame. (An interesting insight here is that the FireEye security monitoring software had the capability to automatically act upon finding specific problems, but again, the central team wanted to review this kind of problem. It may also have had something to do with the fact that the original breach was through an HVAC system, which may have seemed unlikely to cause widespread problems.)
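The triage problem described here, too many flagged items for a team to review manually, is at bottom a prioritization queue: rank alerts and cap how many reach human analysts. A hypothetical sketch (the alert fields, severity ranking, and `triage` helper are all illustrative assumptions, not Target's or FireEye's actual tooling):

```python
# Hypothetical alert records; the "severity" and "source" fields are assumptions.
alerts = [
    {"id": 1, "severity": "critical", "source": "fireeye", "msg": "malware.binary on POS host"},
    {"id": 2, "severity": "low",      "source": "av",      "msg": "adware quarantined"},
    {"id": 3, "severity": "critical", "source": "fireeye", "msg": "unknown outbound transfer"},
    {"id": 4, "severity": "medium",   "source": "ids",     "msg": "port scan detected"},
]

# Lower rank means higher priority for manual review.
SEVERITY_RANK = {"critical": 0, "high": 1, "medium": 2, "low": 3}

def triage(alerts, max_manual_reviews):
    """Sort alerts by severity and return only the top slice for manual review."""
    ranked = sorted(alerts, key=lambda a: SEVERITY_RANK[a["severity"]])
    return ranked[:max_manual_reviews]

# With capacity for only two manual reviews, both critical FireEye alerts
# make the queue and the low/medium items are deferred.
queue = triage(alerts, 2)
```

The trade-off Target faced is visible even in this toy: capping the manual queue keeps analysts from drowning, but anything that scores below the cut, however it got scored, is never seen by a human unless automated response is also enabled.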



A new handbook on Cyber Risk Oversight, designed to provide corporate directors with expert guidelines to improve their cybersecurity oversight, has been published by the American International Group (AIG), the National Association of Corporate Directors (NACD), and the Internet Security Alliance (ISA). The handbook is the latest issue in NACD’s Director’s Handbook Series.

The cyber threat is a very real concern for business continuity professionals, as identified in the 2014 BCI Horizon Scan Report, with cyber attack and data breach featuring second and third respectively as the biggest threats to organizations. 73% of respondents to the survey expressed either concern or extreme concern at both these threats materialising. Such is the nature of the threat that it was the main topic of conversation in the launch edition of the BCI's Working Paper Series.



CSO — The pace of change for Information Technology is challenging established notions of "What is IT?" and "What is Information Security in the modern age?" For one example, the "new" data center technologies such as virtualization, Software-Defined Networking (SDN), service-oriented delivery models, and cloud computing have radically changed the typical IT infrastructure from a defined set of assets owned and controlled by the organization to a constantly fluctuating roster of resources that can come and go from IT department visibility and control.

As this has occurred, we have witnessed the equivalent of a Cambrian Explosion of new Internet-connected life forms: mobile devices, tablets, sensors, actuators, home appliances, monitoring systems, content access devices, and wireless terminals. Applications running on these devices range from recreation to services critical to the functioning of our social and economic infrastructure. Put it all together, and we expect that the world population of Internet-connected devices will grow from today's 10 billion to over 50 billion by the year 2020.

From a security point of view, these IT changes, including the expansion of Internet-connected devices, lead to a corresponding increase in attack surface. Instead of the mission of protecting a reasonably known and enclosed IT perimeter, we now must be ready to secure any connected device humans can make against any threat a hacker can innovate. Clearly, using established security practices, except on a larger scale, will not suffice.



Despite growing levels of awareness and understanding of cyber risk among large and medium-sized corporations across the UK and Ireland, board-level ownership of the issue remains comparatively low with many firms relying on their IT departments for the strategic direction of their cyber risk strategies.

According to the Marsh Risk Management Research, UK & Ireland 2014 Cyber Risk Survey Report, cyber risk now features prominently on the corporate risk registers of organizations across the UK and Ireland, with one quarter (24 percent) of respondents placing it in the top five risks they face and over half (56 percent) placing it in their top ten.

However, Marsh’s research found that cyber risk is managed and reviewed at board level in just 20 percent of respondents’ organizations with 57 percent of respondents stating that the overall responsibility for the assessment and management of cyber risk lies with their IT departments.



Officials at the US National Institute of Standards and Technology (NIST) have announced plans to establish a new research Center of Excellence to work with academia and industry on disaster resilience.

NIST Centers of Excellence are meant to provide multidisciplinary research centers where experts from academia, industry and NIST can work together on specific high-priority research topics. The agency established its first such center, dedicated to advanced materials research, in December 2013.

The disaster resilience Center of Excellence will focus on tools to support community disaster resilience; and will work on developing integrated, systems-based computational models to assess community infrastructure resilience and guide community-level resilience investment decisions. It will also develop a data management infrastructure that allows for public access to disaster data, as well as tools and best practices to improve the collection of disaster and resilience data.


Only half of employees believe their workplaces are prepared for a severe emergency, according to the third annual workplace safety survey by Staples, Inc. Nearly two-thirds of those polled said recent natural disasters have not led to their employers reassessing company safety plans. The survey also reveals that in the past six months nearly half of businesses have closed due to severe weather, costing the economy nearly $50 billion in lost productivity.

Small business employees feel more at risk to emergencies and disasters than employees at larger companies. The survey found that workers at businesses with fewer than 50 people are less aware or less sure who is in charge of emergency planning than employees at larger companies. Employees from smaller companies report having less emergency equipment or plans in place, are less likely to do safety reviews or drills, and are less prepared for severe emergencies than their counterparts at bigger organizations.

About the survey
Staples conducted an online survey of more than 400 office workers and 400 decision makers at organizations of all sizes across the US. The survey, conducted in May 2014, asked a series of questions about general office safety.


Social media is increasingly being looked to as a tool for emergency management. It has a number of attractive characteristics, including cloud-based resiliency and being well-known and understood by a large portion of the public and professionals alike. The problem that many organisations face is in knowing how to prepare their use of social media. Trying to test the social media component of an emergency management plan is a delicate matter. Simply prefacing social media messages with ‘This is a test’ is optimistic at best.



Strange are the ways of the technology market gods. While technology itself follows a fairly predictable bell curve of hype, the terms seem to come and go in spurts.

Several years ago, experts and vendors, such as those in this Forbes piece, would often talk about “data lakes” as a way of explaining Big Data’s capabilities. Big Data was going to change everything: No more silos, no more separation of structured and unstructured data, and no more need for data marts.

It was more of a metaphor for the capabilities than anything specific, as I recall.



MEXICO CITY — The past two months have brought an unusual succession of earth tremors to the Mexican capital — and a business opportunity for Andres Meira.

Meira, a 39-year-old architect and social entrepreneur, started a company that produces a small earthquake alarm receiver that wails before a potentially destructive earthquake hits the capital.

For sale for about $54, Meira’s device costs a fraction of his competitors’.

That there is a market at all for such a receiver casts light on a quirk of Mexico’s pioneering seismic alert system, considered one of the most advanced in the world, and the unusual geologic conditions that cause Mexico City to shake even from distant quakes.



Computerworld UK — Software licenses for mobile users are a "grey area" legally, opening enterprises up to mounting costs unless a compromise with vendors is made, Forrester has warned.

Although software vendors have forecast that their spoils from maintenance support will grow, the reality is that companies are seeing diminishing maintenance budgets against increasing demands for technology to improve customer service. Therefore CIOs "must better align spending", analyst Duncan Jones said at the Forrester Forum for Technology Management Leaders in London this week.

Increasing mobile users are blurring the definition of what constitutes a separate user license, as software vendors like Oracle and SAP attempt to capture revenue from businesses' new mobile projects.



In this day and age of data center efficiency, just about every IT manager is familiar with the concept of hot aisles and cold aisles.  By directing proper air flow in and around racks of humming equipment, the enterprise is able to reduce operating expenses even as it increases utilization, and therefore heat generation, of key equipment.

What may not be widely known, however, is that there are numerous options when it comes to hot/cold designs, and what works for one facility may not be optimal, or even desirable, for another.

For example, some argue that the cold aisle containment portion of the equation may in fact be more crucial than hot aisle containment. According to Mark Hirst, head of T4 data center solutions at rack and cabinet designer Cannon Technologies, the difference comes down to the most effective use of cooling resources. Do you want cold air to go specifically toward data equipment, or do you want it to dissipate hot air in the room at large? Neither approach is wrong per se, although cold aisle containment does provide for faster cooling response in the event of sudden data spikes.