
Industry Hot News

Wednesday, 26 August 2015 00:00

Five steps to protecting data in the cloud

Logicalis US says that there is a growing misperception that data that resides in the cloud is automatically protected just because it’s in the cloud. This, the company warns, is absolutely not the case.

“There’s a common misconception that placing your data in the cloud solves all problems, and that’s just not true,” says Eric Brooks, Cloud Services Practice Manager, Logicalis US. “Not all cloud providers are built to accommodate enterprise-level IT needs; many don’t provide the kind of advanced networking, backup or disaster recovery services you would expect to find in an enterprise IT organization. Don’t assume the cloud is somehow magic. When you consume cloud services, it’s critical to know what you are getting. You have to understand what inside of your business is driving the move to the cloud, and whether the services your cloud provider offers align with those business drivers.”

Cloud providers buy the same servers as their customers – just more of them. This means the same issues a CIO might face in a corporate data center regarding backup, disaster recovery and data retention can be amplified within a cloud provider’s environment.



The focus of most business continuity plans is operational resiliency: how to keep operations running smoothly in the event of an unforeseen crisis. But some crises don’t involve disruptions to service delivery. When a reputation management crisis occurs, the steps for restoring marketplace trust are vastly different from those required for physical disruptions or power and IT failures.

Warren Buffett is often quoted as saying, “It takes 20 years to build a reputation and five minutes to ruin it. If you think about that, you’ll do things differently.”

Reputation-based crises can be triggered by any number of events.  Some examples include:



As the amount of data that businesses generate continues to grow, managing the storage of that data has become an increasingly large problem for the average internal IT organization. In fact, the time may have come to outsource the management of data storage altogether.

With that in mind, ClearSky Data today unfurled its managed cloud storage service based on a set of tiered services that are federated across a mix of solid-state and magnetic storage devices hosted in data center facilities owned by hosting services providers, such as Digital Realty, and cloud storage service providers, such as Amazon Web Services (AWS).

ClearSky Data CEO Ellen Rubin says that the volume of data that needs to be stored has reached a point where it is simply more efficient to contract a third party to manage it. Doing so then frees up the internal IT organization to focus more on adding value in terms of managing applications instead of dealing with lower-level infrastructure, says Rubin.



Some things never change, even in the cloud. No matter where data is stored, it still requires robust backup infrastructure that not only preserves data for the long term, but can also make it available at a moment’s notice.

This has become increasingly challenging, however, as both the data load and the operational complexity of the enterprise environment increase. And while the cloud does provide an answer to its own problem in the form of low-cost, flexible backup capabilities, it is by no means the only way to preserve data.

Ideally, the enterprise should implement a single backup solution for both on-premises and cloud infrastructure. This is the idea behind the recent partnership between Nexenta and Veeam Software, which unites the NexentaStor software-defined storage solution with the Veeam Backup & Replication platform. The combination allows organizations to extend backup and recovery operations across multiple storage tiers and targets in local or distributed infrastructure, while providing active management to continuously forward data to the most cost-effective storage solution based on utilization, data type and other parameters.
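The policy of continuously forwarding data to the most cost-effective tier can be sketched as a simple rule over access recency. The tier names, idle-day thresholds, and function below are hypothetical illustrations of the concept, not the Nexenta or Veeam product APIs.

```python
# Illustrative storage-tiering policy: place data on the cheapest tier
# that still matches how recently it was accessed. All tier names and
# thresholds are assumptions for this sketch, not vendor parameters.

# Tiers ordered hot -> cold; each entry is (name, max idle days covered).
TIERS = [
    ("local-ssd", 7),       # accessed within the last week: keep hot
    ("local-disk", 30),     # accessed within the last month: warm
    ("cloud-object", None),  # older data: cheapest cold storage
]

def choose_tier(days_since_access: int) -> str:
    """Return the first (fastest) tier whose access window covers the data."""
    for name, max_idle in TIERS:
        if max_idle is None or days_since_access <= max_idle:
            return name
    return TIERS[-1][0]
```

A real implementation would also weigh data type, utilization, and per-GB cost, as the article describes, but the core decision reduces to a lookup like this one.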



Have you tried VDI in the past, but felt the use cases were too limited?

Are you foregoing the security, mobility and flexibility benefits of virtual desktops due to a lack of in-house VDI skillsets, or because the cost and complexity is intimidating?

Are you interested in the simplicity of DaaS, but stymied by the lack of control over the desktop, moving everything to the cloud, inflexibility of deployment options, and the poor user experience delivered by DaaS providers?

If these or other reasons have been holding you back from deploying virtual apps and desktops or embracing cloud services, it’s time to put those experiences and doubts behind you.

Welcome to Citrix Workspace Cloud.



(TNS) - Sitting on a shelf in a county building in north Kalispell, there’s a four-inch-thick binder full of nightmare scenarios for Flathead County.

Avalanches, earthquakes and hazardous material spills are included. There are sections for nuclear emergencies and terrorist attacks.

These disaster plans are drawn up by officials who calmly look at agency resources and determine the best course of action. They look at how equipped medical responders would be if an earthquake brought a mountainside into the center of Columbia Falls, perhaps.

“We have to sit and we have to look at those worst-case scenarios,” said Nikki Stephan, 30, the emergency planner for the Flathead County Office of Emergency Services.



SAN DIEGO, Calif. – This is Part 2 in a series that explores the innovative and highly effective ways that organizations can strengthen their response to a cyber-attack. This series is written by CAPT. Mike Walls, former Commander of U.S. Navy Cyber Readiness and current Managing Director, Security & Operations at EdgeWave.

Most IT professionals will tell you that regular network vulnerability assessments are critical to good network hygiene. They will also tell you that periodic penetration tests are a good idea. But these techniques are only snapshots in time and do not measure or replicate the broader organizational impact of a breach.

The fact is that not even the most heavily resourced cyber defense capability will identify and defeat all adversaries at the network perimeter. Accepting the reality that at some point a hacker will be successful, organizations must prepare for sustaining critical business functions and operations while the Security and IT staffs are pushing the attacker off of the network. So how can a company do this? Let’s walk through a scenario which should answer the question…


(TNS) -- Thanks to the proliferation of cellphones and access to social media, misinformation sometimes is conveyed to parents of Pasco County, Fla., schoolchildren during emergencies and nonemergencies.

In the past, a school participating in a safety drill received frantic phone calls from worried parents whose children texted them to say the school was in a lockdown situation, which means there is a direct threat to the campus, staff or students, Superintendent Kurt Browning said.

“You know how the game of rumor goes,” Browning said. “A student will (send a tweet, text or other message) about guns being on campus, when there are no guns on campus.

“We want to start pushing the right information out.”

To keep parents better informed about safety matters, the school district recently partnered with Pasco sheriff’s officials to develop an information system that will dispense fast, accurate information via Facebook, Twitter, Instagram and the Internet, officials said.



The Solutions Lab recently produced a document that provides single-server scalability data on running XenApp and XenDesktop within a Federal environment.

This environment was built out using Common Criteria Evaluation and Validation Scheme (CCEVS) guidelines provided by Citrix Security, as they pertain to Federal Information Processing Standards (FIPS) 140-2 compliance and other essential components such as McAfee HBSS.

Test scenarios were designed to use various combinations of configurations and encryption types. Data collected during these scenarios shows the impact of a typical public sector configuration compared with the commercial baseline.

They were as follows:

  • Baseline
  • Common Criteria with HBSS
  • SecureICA
  • FIPS Internal (TLS + AES)
  • NetScaler FIPS Out + Basic In
  • NetScaler FIPS Out + FIPS In

The measurements and interpretation of the data gathered during this process and the explanation of the different scenarios and configurations used are fully documented in the Citrix XenApp 7.6 and XenDesktop 7.6 Public Sector Lockdown Design Guide.

Tuesday, 25 August 2015 00:00

Behavior Recognition as Cybersecurity Tool

If you think about it, many of the security incidents that companies deal with are a direct result of human behavior. Take phishing emails, for instance. You can put in all kinds of perimeter protection like firewalls, but that only does so much. Those who use phishing (or spearphishing) email as a form of attack aren’t worried about firewalls. They know that at some point the perimeter security will break down, and it will be that email versus the most vulnerable link in the security chain: the human being reading that email. According to an EnterpriseAppsTech article, the bad guys are targeting the weakest link on the network, and more often than not, that weakest link is an employee:

Since the target of these attacks is actually the user, it is the user who needs to be the first line of defense. Security awareness training, then, is the best defense against these attacks. The more end users are made aware of the risks, the less likely they are to act on impulse when pressed for information, and the better they can evaluate each request.
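Awareness training teaches users to spot a handful of recurring red flags – urgency language, credential requests, suspicious links. A toy scorer like the one below shows how those same cues can be expressed as rules; it is purely illustrative (real mail filters use far richer features and machine learning), and every pattern here is an assumption for the example.

```python
# Toy phishing heuristic: count how many common red flags appear in a
# message. Illustrative only -- not a production filter.
import re

RED_FLAGS = [
    (r"urgent|immediately|act now", "pressure / urgency"),
    (r"verify your (account|password)", "credential request"),
    (r"http://\d{1,3}(\.\d{1,3}){3}", "link to a raw IP address"),
]

def phishing_score(message: str) -> int:
    """Return the number of red-flag patterns found in the message."""
    text = message.lower()
    return sum(1 for pattern, _label in RED_FLAGS if re.search(pattern, text))
```

For example, `phishing_score("URGENT: verify your password at http://192.168.0.1/login")` trips all three rules, while an ordinary message trips none.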



Even when you’re part of a large team, effective Business Continuity (BC) planning presents a series of very serious challenges. When you’re the only person in your organization responsible for BC planning, those challenges are magnified greatly. That’s why BC professionals Keith Cantando of Cisco Systems and Michael Lazcano of Gap Inc. – both users of ResilienceONE software – will join me to present revealing case studies on “Managing Global BC Programs as THE Lone Planner” at Disaster Recovery Journal (DRJ) Fall World 2015 on Monday, September 28 from 2:45 to 3:45 PM PST.

Attendees will learn both effective strategies and current BC best practices. These include how to gain executive support within your organization as an advocate for effective BC planning, critical time-management tips, and how to leverage other components of your organization to achieve results. During this interactive presentation and discussion, attendees will also learn how to effectively meet their own unique challenges with a tool like ResilienceONE that includes built-in intelligence and is ready to operate out of the box.

You can learn more about this exclusive peer-to-peer session here.

If you are not attending DRJ Fall World 2015 in-person, I will share the video-recorded session after the conference. Keep an eye out for my next blog or visit www.strategicBCP.com during the first week of October – when the video-recorded session will be available.

BILOXI, Miss. – More than $3.2 billion in FEMA funding has been allocated to Mississippi for Public Assistance after Hurricane Katrina. FEMA’s Public Assistance program includes grants for the repair and rebuilding of public infrastructure, such as bridges, roads, schools, hospitals and sewer treatment facilities. The PA program also provides funding for debris removal and emergency protective measures, such as search and rescue operations, temporary roads and overtime for other emergency workers, including police and firefighters. 

Some of the PA projects in Mississippi included repairing and rebuilding the Biloxi Civic Center and Library; the Hancock County Courthouse and Medical Center; the Waveland City Hall and Municipal Complex; and St. Martin School in Jackson County. The famous Biloxi Lighthouse, which came to represent the resiliency of the Mississippi Gulf Coast after Katrina, was also repaired with funding from FEMA’s PA program.

For more information on PA recovery projects in Mississippi, please go to FEMA’s Revitalizing Mississippi Communities.

The PA program normally reimburses local, state and tribal governments and qualified nonprofit organizations for a certain share of eligible costs. However, because of the magnitude of Katrina, FEMA covered 100 percent of allowable project expenses.

The largest PA project funded by FEMA in Mississippi after Katrina is the repair of a large section of Biloxi’s water and sewer treatment system. After Katrina, the system was out of operation for weeks. It was brought back into working condition with generators and temporary bypass pumps to transfer wastewater to treatment plants. City officials decided to use the $363 million in eligible FEMA repair and rebuilding grants toward the total cost of improving and upgrading the system. Repairs include consolidating and hardening the pump stations along the beachfront to withstand future storms.

Following is a breakdown of Public Assistance funding by sector.

Health Care: More than $50 million has been obligated to rebuild and improve hospitals and other health care facilities in disaster-affected areas. Over $40.6 million was used to restore parts of Hancock Medical Center in the town of Bay St. Louis, which serves a population of approximately 44,000.

Education: More than $334 million has been allocated for K-12 public schools and universities. Some of these schools, such as the St. Martin school in Jackson County, combined the funding with other sources to build new state-of-the-art educational facilities and a public safe room to protect the community from future disasters.

Public Works/Utilities: Nearly $901.6 million has been obligated. This includes more than $363 million to repair and rebuild part of Biloxi’s sewer and water treatment system and nearly $99 million for the sewer and water treatment system in Gulfport. More than $36 million funded the repair of the wastewater treatment facility in Diamondhead.

Roads and Bridges: More than $84 million was allocated for repair and rebuilding.

Public Safety and Protection: Over $33 million has been obligated for the restoration of fire and police stations, courthouses and corrections facilities. Some public safety buildings, such as the Pass Christian police station, were rebuilt using FEMA 361 standards for public safe rooms to protect first responders in future disasters.

Historic Structures: Over $129 million was obligated for restoration of historic properties, such as the town of Waveland’s Civic Center, the Carnegie Library in Gulfport and the Old Brick House in Biloxi.

Debris Removal/Emergency Protective Measures: More than $1.15 billion was allocated to clear debris and reimburse overtime hours for emergency workers, including police and firefighters.


 FEMA’s mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards.

(TNS) - Dozens of wildfires continued to ravage the Pacific Northwest, particularly Washington state, as more firefighting equipment and manpower arrived from across the nation over the weekend.

Twelve uncontained fires in Washington covered more than 600,000 acres Sunday, according to the National Interagency Fire Center. Gov. Jay Inslee’s office said more than 200 homes had been destroyed and 12,000 homes remained threatened.

More resources to battle the blazes became available after the Obama administration declared a federal state of emergency in the area last week.

National Guard Blackhawk helicopters from Colorado, Minnesota and Wyoming headed over, Inslee’s office said in a statement. An incident management team from San Diego went north to oversee a new staging area at Fairchild Air Force Base near Spokane, the governor’s statement said, and 20 large fire engines “specifically designed to protect threatened communities and residences” were coming from Arizona, Colorado, Nevada and Utah.



Tuesday, 25 August 2015 00:00

Weaker Danny a Threat to State

(TNS) - Once-powerful Hurricane Danny continued to weaken, dropping to tropical storm strength late Saturday — but remaining on a track that could steer it or its remnants toward South Florida by late this week.

Forecasters said the storm — with maximum winds dropping to 60 mph in the National Hurricane Center’s 11 p.m. update — was expected to continue weakening over at least the next two days.

“By Thursday, it is expected to be a depression,” said National Weather Service meteorologist Dan Gregoria.

The five-day track shows that Danny, an unusually small storm, could be anywhere between Cuba and the Bahamas by Thursday afternoon. If the storm holds together, South Florida could be poised to fall in the five-day forecast cone.



When Hurricane Katrina slammed into New Orleans 10 years ago this week, countless businesses were submerged in water and were partially or totally destroyed. Those without reliable disaster recovery operations in place paid an enormous price. In many ways, those with reliable disaster recovery operations in place paid an enormous price, too. But at least for them, their core business operations were unscathed.

A compelling example of the rewards of that foresight and preparedness is the case of Cooperative Processing Resources (CPR), a debt management system software provider in Richardson, Texas. CPR, which oversees a network of agencies that provide credit counseling services to consumers, has a New Orleans office that took a direct hit from Katrina, and was shut down. The good news is that the services provided by that office were back up and running within an hour.

That near-instantaneous recovery was made possible by the data hosting and disaster recovery operation that CPR has had in place since 2002 with Wayne, Pa.-based Sungard Availability Services. I recently had the opportunity to speak with CPR president Kate Campion about her company’s disaster recovery operations, and I began the conversation by asking her about the backup strategy the company had in place since its inception in 1994, and before Sungard AS came along. For many IT veterans, her response will bring back a lot of memories:



From the Middle East to Eurasia to Eastern Europe, events and potential events that translate into political risk fill the news.

Political risk is instability that damages or threatens to damage an existing or potential asset, or significantly disrupt a business operation. Examples include sustained political and labor unrest, terrorism and violent conflict. This risk is increasingly regional in nature, as the Arab Spring and sudden spread of Islamic State control demonstrate.

According to the new Clements Worldwide Risk Index, political unrest is the number one concern among top global managers at multinational corporations and global aid and development organizations.



You can create a Citrix XenDesktop proof of concept deployment with just a few clicks. Don’t believe it? What if we could show you how in just four straightforward steps?

You can. Welcome to Citrix Lifecycle Management.

What is Citrix Lifecycle Management?

Citrix Lifecycle Management is a comprehensive cloud-based lifecycle management solution to accelerate and simplify the design, deployment and ongoing management of Citrix workloads and enterprise applications.

Supporting many types of IT workloads across virtual and private or public cloud environments, this solution enables IT organizations to become faster, more cost-effective and more agile, and it helps maintain service quality and high availability with redundancy, automatic scaling and disaster recovery of applications. Built on blueprints incorporating validated reference architectures, configurations and best practices, Citrix Lifecycle Management provides a unified and standardized set of management tools for rapid and best practice-driven design, deployment and management of Citrix workloads and enterprise applications.



Business continuity consultant, Charlie Maclean-Bristol FBCI, recently conducted a response exercise using cyber attack as the scenario. In this article he captures ten lessons learnt from conducting the exercise:

Lesson one: I don’t think you need to be an IT security expert to conduct a cyber attack exercise. The technical element of the exercise is done by IT, and if you are looking at the first 24 hours of an incident then you don’t have to be too specific about how the attack took place, just about what the consequence of the attack was.

Lesson two: To be credible you have to do some reading on how other attacks have taken place, what the consequences of them were, and how to respond to them. There is a lot of guidance on the web about this so it is not very difficult to get yourself up to speed on the subject. One particular document I thought was useful was the National Institute of Standards and Technology (NIST) ‘Computer Security Incident Handling Guide.’ It is reasonably technical but it contains lots of useful advice for those who are non-technical.



Mass shootings this summer resulting in multiple deaths are prompting local leaders to educate the public on ways to protect themselves should they encounter an active shooter situation.

“One of the primary purposes is to get churches to understand that they need to develop an emergency operations plan for their house of worship. That will include active shooter and a lot of other emergency situations,” one of the forum’s presenters, Rev. David I. Fox, retired Wilberforce University Police chief, said.

Fox, whose law enforcement career has spanned 40 years, said he will be sharing the precepts of A.L.I.C.E., which stands for Alert, Lock down, Inform, Counter and Evacuate.

Anyone caught in an active shooter situation should first try to get out and second call for help, Fox said.



Monday, 24 August 2015 00:00

Data: its security and encryption

Data security is essential to guaranteeing the confidentiality of information, especially in an age of anonymous attackers, identity theft and hacking. It should be a major concern for anyone who lives at least part of their life online.

But how do we address this? Mainly by encrypting the information that we send digitally around the world. But encryption isn’t without its drawbacks.

The risks of the encryption of data

When assessing the risks of encryption, we first need to assess one thing: the level of encryption. The level of encryption is generally determined by the number of bits used to create an encryption key, which in turn drives the series of transformations that convert plaintext into ciphertext. Fortunately, there are established keys and official algorithms, such as AES (the Advanced Encryption Standard, which dates from 2001 and replaced DES, the standard created in the 1970s), that are used in numerous transactions over SSL (Secure Sockets Layer, which includes authentication) on the Internet. AES exists in 128-, 192- and 256-bit versions and is very strong; its robustness depends on its algorithm, but also, naturally, on the number of bits used for its key.
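The practical meaning of key length is easy to see with a little arithmetic: every added bit doubles the keyspace a brute-force attacker would have to search. A minimal illustration:

```python
# Why key length matters: the number of possible AES keys doubles with
# every added bit, so the brute-force search space grows exponentially.
for bits in (128, 192, 256):
    keyspace = 2 ** bits
    print(f"AES-{bits}: about {keyspace:.3e} possible keys")
```

Even AES-128’s keyspace (roughly 3.4 x 10^38 keys) is far beyond exhaustive search with any foreseeable hardware, which is why attacks in practice target implementations and key management rather than the cipher itself.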



Friday, 21 August 2015 00:00

Infographic on Hurricane Katrina

The Insurance Information Institute (I.I.I.) is looking back at the costliest hurricane in U.S. history that took 1,800 lives and cost $125 billion in total economic losses, via a comprehensive infographic.

Insurance claims by coverage and state, total National Flood Insurance Program losses from Katrina, and other sources of Katrina recovery funds are all detailed.

Another compelling section of the infographic asks: where are we now?

One of the fascinating analogies it draws is that even as awareness of flooding due to coastal storms rises, so too does the population of coastal communities.



Friday, 21 August 2015 00:00

What Big Data Means to Business

To the enterprise, the words “Big Data” mean a lot of things. It can represent vast amounts of unstructured data from a variety of sources or it can be large volumes of consumer Internet data. It may also represent the need for upgraded IT infrastructure and tools with which the data can be wrangled, stored and analyzed. The point is, it can mean something different to different parts of the business.

In many enterprise organizations, it is marketing and sales that drives the need for Big Data projects. These departments are backed by the C-level executives who are pushing IT to bring the company’s systems and infrastructure in line with what is needed to handle Big Data and be able to analyze and gather actionable information from it to help the business not only provide better services, but gain customers, too.



Friday, 21 August 2015 00:00

API Security Needs to Be Backed by CXOs

I don’t think I’m off base saying this, but in our current Internet security culture, it seems like threats and other issues are taken seriously only when top management begins to recognize the problem. And as we know, C-level executives are almost always the last ones in the company to jump on the security bandwagon.

So, when CXOs do pay attention to a security problem, you can be pretty sure that it is the real deal.

Application program interface (API) security is one such threat. At the Black Hat USA 2015 conference earlier this month, Akana released the results of its survey, Global State of API Security Survey 2015, and it found that API security is becoming a C-level concern, even before it becomes, as ProgrammableWeb put it, a “full-blown crisis.”

According to the study, 75 percent believe that API security has to be a CIO-level concern. But at the same time, 65 percent said that processes aren’t in place to ensure that data accessed by applications is kept secure, and another 60 percent aren’t doing anything to secure API consumers.
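One concrete control behind statistics like these is requiring API consumers to sign each request with a shared secret, so the server can verify both who sent it and that the payload was not tampered with. The sketch below uses Python’s standard `hmac` module; the secret, payload, and function names are assumptions for illustration, not anything from the Akana survey.

```python
# Minimal sketch of HMAC request signing, one common way to secure API
# consumers. The secret and payload here are hypothetical examples.
import hashlib
import hmac

SECRET = b"per-consumer-shared-secret"  # provisioned per API consumer

def sign(payload: bytes) -> str:
    """Compute the hex HMAC-SHA256 signature of a request body."""
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, signature: str) -> bool:
    """Check a received signature; compare_digest resists timing attacks."""
    return hmac.compare_digest(sign(payload), signature)
```

A consumer sends the body plus its signature (typically in a header); the server recomputes and compares, so a tampered body such as a changed order ID fails verification even though the signature itself is unchanged.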



(TNS) -- At the heart of the newest building on the Pacific Northwest National Laboratory campus is an operations center focused on making the largest and most complex machine ever created more secure and more reliable.

“The Systems Engineering Building is a really generic name for a really cool place,” said Elizabeth Sherwood-Randall, the nation’s deputy energy secretary, at the building’s dedication Wednesday.

“I wish it sounded sexier so people could appreciate what amazing things are going to happen inside of it.”

The complex and piecemeal grid system that delivers electricity across the nation is largely a 20th century structure, she said, and it won’t meet many of the demands or the opportunities of the 21st century.



Flood insurance can save Texas homeowners and renters thousands of dollars in repairs. It also can provide peace of mind considering that flooding is the most frequent natural disaster in the United States.

Flood Insurance in Texas:

  • Flooding comes from a variety of sources in Texas, such as rainstorms, tropical storms, and hurricanes.
  • Last year, the National Flood Insurance Program (NFIP) paid out more than $58.5 million in claims for Texans. So far this year, the agency has paid out more than quadruple that amount – exceeding $277.6 million, as of Aug. 19.

  • Nearly 600,000 Texas households had flood insurance as of May 31, according to the NFIP. While that number may seem large, it is a small percentage of the 8.9 million total Texas households.

Costs Add up Quickly:

  • Just three inches of floodwater in a home will require replacing drywall, baseboards, carpet and furniture – repairs that can cost $22,500 in a 2,000-square-foot house.
  • The deeper the floodwater, the higher the repair costs – 12 inches of water in a 2,000-square-foot house can cost $50,000 or more.

Common Misconceptions:

  • Understanding the value of flood insurance is important, yet many people remain uninsured, in part due to common misconceptions.
  • Many policyholders believe their insurance covers all hazards and flood insurance isn’t needed. However, standard homeowner policies do not cover flooding.
  • A federal disaster declaration is not necessary to make a claim on an NFIP flood insurance policy.
  • Homes located outside flood-prone areas need flood insurance, too. Nationally, 25 percent of the total structures that flood each year belong to policyholders whose properties are not in high-risk areas.

Obtaining Flood Insurance:

  • There is normally a 30-day waiting period when purchasing a new policy. Flood insurance is sold through private insurance companies and agents and is backed by the federal government.
  • Flood insurance is available to homeowners, business owners and renters in communities that participate in the NFIP and enforce their local flood plain management ordinances. To determine if a community participates in NFIP, go online to www.floodsmart.gov
  • Homeowners in a Special Flood Hazard Area (SFHA) must buy flood insurance if they have a mortgage from a federally regulated lender.
  • An interactive guide to determine flood risk is available online at www.floodsmart.gov. This site also provides additional information on the NFIP and a list of insurance agents in a homeowner’s area who sell NFIP flood coverage.

Costs and Coverage:

  • Flood insurance premiums average about $700 per year for homeowners.
  • Homeowners can insure their homes for up to $250,000 and contents for up to $100,000.
  • A number of factors determine rates for renters. Renters can cover their belongings in amounts up to $100,000.
  • Nonresidential property owners can insure a building and its contents for up to $500,000 each. 



BILOXI, Miss.--In the last 10 years, FEMA’s Hazard Mitigation Grant Program has obligated more than $159 million from Hurricane Katrina recovery to build community safe rooms throughout Mississippi to protect people during storms. HMGP provides grants to state, local and tribal governments to implement long-term mitigation measures to reduce the loss of life and property from a disaster.

Safe rooms can be built as multipurpose shelters to protect communities from tornadoes, hurricanes and floods. These community safe rooms are built to FEMA 361 specifications, which include hardening of walls and roofs to withstand 200 mph winds.

Mississippi Emergency Management Agency officials have made the construction of safe rooms a priority since Katrina. A recent study from the Centers for Disease Control found that safe rooms are the best option to reduce the number of deaths during tornadoes.

“We always tell folks to get out of mobile homes and manufactured homes, and to consider going to a more substantial structure to wait out the storm,” said Robert Latham, executive director of MEMA. “By providing a secure place for them to go, we make our communities safer. Citizens need to incorporate safe room locations into their plans, or know where a substantial structure is located.”

“In so many cases, the death toll would be much higher were it not for safe rooms for people to take shelter in,” said Acting Director of FEMA’s Mississippi Recovery Office, Loraine Hill.

To date, 42 public safe rooms have been added to schools, 34 have been built as stand-alone structures for general use, and nine have been constructed for first responders. Populations served by these safe rooms include approximately 44,000 students and staff, 28,000 citizens in the general population, and 3,500 first responders.

During the threat of an outbreak of tornadoes in the state earlier this year, 70 residents sought shelter in a community safe room in Rankin County, built to FEMA 361 standards.

Another $205 million in HMGP funding was made available to Mississippi for mitigation projects, such as elevating buildings, flood control, sirens, generators and grants to individuals to retrofit areas of their home or build stand-alone safe room units.

For more information on building a public safe room to FEMA 361 specifications, go to http://www.fema.gov/media-library/assets/documents/3140.

A video on community safe rooms in Mississippi.



(TNS) - On a recent Friday night, while many were kicking off the weekend with an evening on the town, Jenyne Wells waited for the phone to ring.

As a 911 call-taker at SunComm, Yakima County’s emergency dispatch center, she was waiting for anyone needing help.

It would not be long.

“A young girl is being forced into a van,” a woman caller said, speaking through an interpreter. The abductors are fighting and are known to carry guns, the woman said.

Wells calmly but authoritatively asks the woman to describe the vehicle and assures her police are on the way. All the while, she’s typing bursts of information that are transmitted to computer screens in the responding police car, as well as to police dispatchers sitting at the other end of the center.



(TNS) - Tropical Storm Danny strengthened early Thursday to become the first hurricane of the Atlantic season, National Hurricane Center forecasters said.

Sustained winds increased to about 75 mph as Danny headed north-northwest at 12 mph, about 1,000 miles east of the Windward Islands, forecasters said. They expect Danny to continue strengthening at least for the next two days.

The compact storm has hurricane force winds extending only 10 miles from its center, with tropical storm winds reaching about 60 miles.

Hurricane Danny arrives just as the season swings into peak months. On average, the first hurricane of the season forms on Aug. 10.



Today, many organizations are under continuous attack from nation-states or professional cyber criminals. One of the main focuses for IT security teams is stopping intruders from gaining access to assets on the corporate network. However, this strain on IT teams means that when it comes to malicious insiders, a worrying number of organizations almost entirely drop their guard.

An insider attack is one of the biggest threats faced by organizations since these types of hacks can be very difficult for IT teams to identify. This is because an insider – whether he’s an employee or a contractor – is already entrusted with authorized access to at least some systems and applications on a corporate network. It can be very hard for those in IT to decipher whether he’s just performing his regular job tasks, or carrying out something sinister.

Insiders have been responsible for some notable breaches and hostage scenarios in recent history, whether intentional or not. Consider Terry Childs in San Francisco, who held the city’s network hostage for two weeks while sitting in a jail cell, or Edward Snowden, formerly of the NSA.

So, which is a bigger threat - an external hacker or a disgruntled employee?



IT administrators are realizing that application deployments are getting more complex and error-prone than ever before. Additionally, the deployment of Citrix workloads and enterprise applications is only the first step in the lifecycle of applications. Once the workloads are deployed, IT administrators must continuously monitor the health of the workloads to ensure they are running at optimum performance, can scale efficiently to meet changing demands and are always available to end users even in case of application component or environment failures.

What if there were an integrated application service lifecycle management solution that empowers IT to streamline design, deployment and management of a broad array of application workloads across hypervisor or cloud environments, all through a single integrated console?

Now there is! We are excited to announce the General Availability of Citrix Lifecycle Management!

With Citrix Lifecycle Management current and new Citrix customers can:



Storage is expensive. That raises a question for anyone engaged in cloud backup or considering hosting a backup service for others: Should we keep one copy of data or two?

The immediate answer is two, but economics can enter in. Keeping a single copy of data will be the most cost-effective approach, but there is a downside: it leaves you without any insurance should something happen to the primary copy. During Superstorm Sandy, for example, a New York healthcare firm found itself badly exposed when its DR site in New Jersey suffered from flooding. The firm took more than a week to get its systems back up and running, at a colossal loss of revenue.

That’s why keeping two copies is the best bet. But that comes at a price, as building and maintaining this infrastructure means doubling your storage acquisition and operating costs.
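The one-copy-versus-two trade-off above can be framed as simple expected-cost arithmetic: storage spend grows linearly with copies, while the chance of losing everything shrinks geometrically. The sketch below is a hypothetical model with purely illustrative numbers (the function name, prices and probabilities are assumptions, not vendor figures):

```python
# Hypothetical cost model: single vs. dual-copy backup retention.
# All figures are illustrative assumptions, not vendor pricing.

def expected_annual_cost(copies, tb_stored, cost_per_tb_year,
                         p_site_loss, loss_if_all_copies_gone):
    """Annual storage spend plus expected loss if every copy is destroyed.

    Assumes site failures are independent, so the chance of losing
    all copies in a given year is p_site_loss ** copies.
    """
    storage = copies * tb_stored * cost_per_tb_year
    expected_loss = (p_site_loss ** copies) * loss_if_all_copies_gone
    return storage + expected_loss

# Illustrative numbers: 500 TB, $120 per TB-year, a 2% annual chance of
# losing a site, and a $5M revenue impact from an unrecoverable loss.
one_copy = expected_annual_cost(1, 500, 120, 0.02, 5_000_000)
two_copies = expected_annual_cost(2, 500, 120, 0.02, 5_000_000)
print(f"one copy:   ${one_copy:,.0f}")    # $160,000
print(f"two copies: ${two_copies:,.0f}")  # $122,000
```

Under these assumed inputs the second copy pays for itself in expectation; with cheaper downtime or pricier storage the arithmetic can flip, which is exactly the business-driver analysis the decision calls for.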



If you blinked, you would have missed it. That’s how a future tech historian will probably characterize the virtualization era now that containers have emerged as the preferred architecture for data-driven applications.

This isn’t to say that virtualization will no longer be a part of enterprise infrastructure, but that the development and productivity gains will soon move off the virtual layer to a more container-based data stack.

Tech industry analyst Janakiram MSV points to five key signs that the enterprise is on the cusp of a post-virtual environment. Not only is the virtualization market fully saturated by now with more than 75 percent of the enterprise workload now residing on virtual servers, but companies like Docker have demonstrated the efficacy of containers so effectively that even stalwart virtualization backers like VMware and Microsoft have jumped on the bandwagon. At the same time, organizations like the Cloud Foundry Foundation, the Open Container Initiative and the Cloud Native Computing Foundation are starting to coalesce around a new computing paradigm based on containers and container management to accommodate emerging Big Data and mobile applications.



Hurricane Katrina, which pummeled the Gulf Coast of the United States 10 years ago on Aug. 29, has proven to be the deadliest and costliest disaster on record. The 2005 Atlantic hurricane season was the most active in recorded history with more than 30 tropical and subtropical storms, including 15 hurricanes.

According to the study, Hurricane Katrina 10: Catastrophe Management and Global Windstorm Peril Review by Allianz Global Corporate & Specialty, it was predicted that hurricanes would become more frequent and intense after 2005. However, “In reality, the exact opposite has occurred,” Andrew Higgins, technical manager, Americas at Allianz Risk Consulting, explained in the report. Instead, there has been a reduction in Atlantic hurricane activity during the last 10 years, with 2013 seeing the fewest Atlantic basin hurricanes since 1983. “These results illustrate the fact that we do not fully understand the complex climate variables that affect hurricane activity,” he said.

Because Katrina’s impact was so devastating and widespread, many changes have since been made. New Orleans has built a new system of levees, for example. Flooding caused by Katrina revealed the state of the levee systems in the U.S. to be substandard and in need of repairs estimated at $100 billion, the National Committee on Levee Safety found. “There are many levee systems throughout the U.S. that would reveal similar deficiencies if subjected to the same level of scrutiny as those in New Orleans,” according to the study.



(TNS) - When the U.S. Department of Agriculture denied Iowa Gov. Terry Branstad’s request for an avian influenza (AI) disaster declaration, it led Iowa farmers and state-level industry organizations to focus on rebuilding the state’s poultry production industry, which helps account for thousands of jobs.

Not only did the governor make a decision in late July to extend the state’s disaster declaration for a third time, to Aug. 30 instead of its planned July 31 expiration, but also, farms have begun to repopulate their flocks.

The extension basically gives agencies and organizations dedicated to stopping the outbreak the resources and authority needed in an emergency.



Thursday, 20 August 2015 00:00

Mainframe Resurgence: Big Iron for Big Data

The mainframe is back in business in the enterprise, a development that comes as a shock to those who predicted that the cloud would have taken over by now.

In reality, the mainframe was never absent from the enterprise, at least in the really large ones that need to pack substantial amounts of computing power. But now that scale and modularity are in big demand, many organizations are looking at the mainframe as a base on which to build Big Data infrastructure.

This is good news for IBM, of course, which has steadfastly supported the mainframe during the decades when distributed blade architectures were all the rage. The company recently launched two new mainframe models, the Emperor and the entry-level Rockhopper, running the new LinuxOne operating system based on Canonical’s Ubuntu distribution. The combo is targeted toward the rising cadre of Big Data tools, such as Apache Spark, MongoDB and PostgreSQL, and will likely become the focus of IBM’s contribution to the new Open Mainframe Project that looks to do for the mainframe what Google’s Open Compute Project is doing for scale-out commodity infrastructure.



Thursday, 20 August 2015 00:00

Could hackers take down a city?

First the power goes out. It's not clear what's gone wrong, but cars are starting to jam the streets -- the traffic lights are down. And something seems to be going haywire with the subways, too.

No one can get to work. And even if they could, what would they do? A cyberattack has driven the city to a halt.

Of course, that hasn't happened yet -- and to a lot of people the idea of malicious hackers taking down a city still sounds like a bad movie plot. But it may not be as crazy as it sounds, according to security experts who say cities' increasing dependence on technology and the haphazard ways those systems sometimes connect could leave them vulnerable to someone looking to cause chaos.



Thursday, 20 August 2015 00:00


The Lafayette, Louisiana movie theater shooting was methodically planned. The shooter, John Russell Houser, is reported to have visited other theaters prior to the attack and had been to the Grand 16 movie theater in Lafayette at least once before. So, why did he choose this Lafayette theater over the others he visited? Did the others have better security, or at least portray that they were more secure and better prepared?

When analyzing cases like Lafayette, we often hear about what a person with malicious intentions did in the build-up to the attack. This can be a physical or a digital (cyber) related event. The hostile might want to kill, or he could be looking for trade secrets and intellectual property, or to cause embarrassment, as in most cyber-attacks. Whatever the reason, they all have one thing in common – they conduct a period of digital surveillance as part of their initial planning process.

What can movie theaters and other businesses do to help protect themselves against such tragedy and threats against their employees, customers, and other assets? Evaluating your organization’s digital profile is the first place that you can start to protect yourself against hostiles.



Wednesday, 19 August 2015 00:00

5 Ways SMBs Benefit from Embracing the Cloud

Small Biz Technology recently declared “Netflix Got Rid of Its Servers. When Will You? Cloud Rules. Servers Don’t.” And it’s true. Besides getting rid of on-premise servers, the cloud is a good choice for small to midsize businesses (SMBs) for many reasons.

Number one, choosing cloud options can save SMBs money. Growing businesses can invest in desktop cloud services, which allow them to add users as the company grows. In much the same vein, putting data in the cloud allows for scalability as a company’s data needs expand, but it also contributes to savings on power and hardware purchases. And if you opt for managed services through a cloud provider, you won’t need to pay for as many IT staff to manage it all.

As reported by Business2Community, data security is another reason SMBs turn to the cloud.



The data center is becoming more software-defined, with distributed, cloud-based architectures making bricks-and-mortar facilities appear more and more like single computing units, basically building-sized PCs, tied to a globally networked infrastructure.

So it shouldn’t come as any surprise that the selection of the software platform, or operating system, for the data center is emerging as one of the most important decisions on the agenda, eclipsing concerns about server, storage and networking hardware.

At this point, it seems that the only certainty when it comes to data center software is that it will have to be based on open standards. That makes Linux the default choice, given that it already owns a good chunk of the legacy data environment. Red Hat executives have not been silent on this subject, with top names like Paul Cormier, president of worldwide products and technology, crowing at the company’s 2015 Summit in Boston recently that “Linux has won the data center.” The next step, he says, is to push open source across the entire operating and application development infrastructure.



Organizations of all sizes, across all industries, have become data breach victims as cyber crooks become more sophisticated in identifying vulnerable targets. Attackers can compromise an organization within scant minutes in 60 percent of breaches, reports the latest Verizon Data Breach Investigations Report. Still, insiders persist as one of the biggest fraud perpetrators, costing organizations globally an estimated $3.7 trillion in 2014, according to the Association of Certified Fraud Examiners. The puzzling question is this: With the advances in technology, why aren’t organizations preventing these incidents, and why aren’t the offenders being nabbed earlier?

The answer to the insider fraud dilemma lies in a lag in robust risk-management technologies that help organizations identify and prevent insider fraud, especially in such industries as banking. With this type of breach, tracking behavior becomes a key component of managing risks and threats proactively. While basic data tracking isn’t new, what is fresh is grasping the internal behavior of employees in a real time, comprehensive view across multiple platforms and applications.
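As a rough illustration of what real-time behavior tracking can mean in practice, the sketch below flags activity that deviates sharply from a user's own historical baseline. It is a minimal, hypothetical example with assumed numbers; production insider-fraud platforms correlate many signals across multiple platforms and applications:

```python
# Hypothetical behavior-baseline check: flag a day's activity that is
# far outside a user's own historical norm. Thresholds are assumptions.
from statistics import mean, stdev

def is_anomalous(baseline, todays_count, threshold=3.0):
    """Return True if today's count is more than `threshold` standard
    deviations above the mean of the user's historical daily counts."""
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        # Perfectly flat history: any change at all is unusual.
        return todays_count != mu
    return (todays_count - mu) / sigma > threshold

baseline = [38, 42, 41, 39, 40, 43]   # a teller's normal daily record views
print(is_anomalous(baseline, 400))    # True  -- a sudden 10x spike
print(is_anomalous(baseline, 44))     # False -- within normal variation
```

A single per-user z-score like this is only a starting point; the "comprehensive view" the article describes would combine such baselines across systems, peer groups and time of day.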



As cloud computing continues to transform how business is being conducted, a lot of attention has been paid by managed service providers (MSPs) to external and technical security threats. Almost all cloud-based file sharing systems have very powerful security features. You can hardly find an MSP who does not offer two-step authentication, robust encryption and periodic data backups. After all, no company will even bother knocking on your doors if you cannot convince them their data will be in safe hands.

But while technical security features and jargon might instill a measure of trust in your customers, have you really considered the threat your own organization might pose to your service?

A well-motivated workforce has the potential to transform a business landscape for the better and drive an organization to success. But, it only takes one disgruntled employee to send your company back to square one.



Since I first became the research director of the Security & Risk team more than five years ago, security leaders have lamented the difficulty of aligning with the business and demonstrating real business value. Over the years, we’ve written an enormous amount of research about formal processes for aligning with business goals, provided key metrics to present to the board, and developed sophisticated models for estimating security ROI. Yet for many, demonstrating real business value continues to be a significant challenge. If it wasn’t for the 24-hour news cycle and a parade of high-profile security breaches, chances are good that security budgets would have been stagnant the last few years.

Why are business alignment and demonstrating business value so hard? It’s because for too long, security leaders have focused on managing regulatory risks at the lowest possible cost, and on securing corporate perimeters, back-end systems of record, and data center infrastructure. Security leaders have not been working with counterparts in the business and marketing leaders to champion privacy, embed controls directly into customer-facing products and services as a competitive differentiator, or to help them identify, analyze, and mitigate risks in the customer life cycle. If your security priorities and investments don’t focus on helping your firm win, serve, and retain customers, and thereby increasing your firm’s top-line growth, then I’m not surprised if demonstrating business value is an issue for you.



SAN DIEGO, Calif. – This is Part 1 in a 3 Part series that explores the innovative and highly effective ways that organizations can strengthen their response to a cyber-attack. This series is written by Capt. Mike Walls, former Commander of U.S. Navy Cyber Readiness and current Managing Director, Security & Operations at EdgeWave.

If I were to ask an IT Professional to explain why his or her network is secure, I would probably hear a response that goes something like, "I have the latest and best technology, I do regular vulnerability scans, I do an annual penetration test, and I am in compliance with my industry's security requirements and standards." At face value, that sounds like a solid answer and it appears that the IT Professional is taking the necessary steps to ensure that his company's network is secure. In reality, it is more likely that this answer is only partially correct…


The British Security Industry Association (BSIA) has published a new guide that aims to help organisations in the public sector better manage the secure deletion of their data.

Called Information Destruction in the Public Sector and published on July 28th, the whitepaper is based on official guidance from the Cabinet Office and the Centre for the Protection of National Infrastructure (CPNI), including the CPNI benchmark Secure Destruction of Sensitive Items.

It explains the differences between Top Secret, Secret and Official classifications for government information, as well as their respective requirements when it comes to disposing of printed and digital data.

Commenting on the publication, chairman of the BSIA Information Destruction section Adam Chandler said that data breaches can “ruin a government’s credibility as well as a private company’s reputation”.

“By adhering to the standards set by the government and referenced by the BSIA in this paper, citizens, employees, and civil servants will be better protected,” he added.

When it comes to the secure deletion of end-of-life data, you can rely on the accredited software and hardware from Kroll Ontrack.

Source: http://www.krollontrack.co.uk/company/press-room/data-recovery-news/bsia-publishes-secure-data-destruction-guide876.aspx


♦ SBA offers low-interest disaster loans to businesses of all sizes, most private nonprofit organizations, homeowners and renters.

♦ Businesses may borrow up to $2 million for any combination of property damage or economic injury.

♦ SBA offers low-interest working capital loans (called Economic Injury Disaster Loans) to small businesses, small businesses engaged in aquaculture and most private nonprofit organizations of all sizes having difficulty meeting obligations as a result of the disaster.

♦ If you are a homeowner or renter, FEMA may refer you to SBA. SBA disaster loans are the primary source of money to pay for repair or replacement costs not fully covered by insurance or other compensation.

♦ Homeowners may borrow up to $200,000 to repair or replace their primary residence.

♦ Homeowners and renters may borrow up to $40,000 to replace personal property.


♦ Begin by registering with FEMA. If you haven’t already done so, call (800) 621-FEMA (3362) or visit www.disasterassistance.gov.

♦ Homeowners and renters should submit their SBA disaster loan application, even if they are not sure that they will need or want a loan. If SBA cannot approve your application, in most cases they will refer you to FEMA’s Other Needs Assistance (ONA) program for possible additional assistance.


We encourage every individual and business owner to come into the Disaster Recovery Center and speak one-on-one with an SBA Customer Service Representative. Our representatives will answer all of their questions and explain the application process. We will also help each business owner and homeowner complete their application to apply for a low-interest disaster loan.

The SBA is the federal government’s primary source of money for the long-term rebuilding of disaster-damaged private property. SBA helps businesses of all sizes, private non-profit organizations, homeowners, and renters fund repairs or rebuilding efforts and cover the cost of replacing lost or disaster-damaged personal property. These disaster loans cover losses not fully compensated by insurance or other recoveries and do not duplicate benefits of other agencies or organizations. For more information, applicants may contact SBA’s Disaster Assistance Customer Service Center by calling (800) 659-2955 or visiting SBA’s website at www.sba.gov/disaster. Individuals who are deaf or hard of hearing may call (800) 877-8339.


In theory, creating privileged access accounts to the most critical areas of your company’s network is supposed to add a layer of security to sensitive data or infrastructure. However, these accounts are difficult to completely lock down and thus could be a data vulnerability for many enterprises, says TechTarget’s SearchSecurity:

In the wrong hands, privileged accounts represent the biggest threat to enterprises because these accounts can breach personal data, complete unauthorized transactions, cause denial-of-service attacks, and hide activity by deleting audit data. Privileged accounts, such as the UNIX root, Windows Administrator accounts or accounts associated with database ownership and router access, are required for platforms to function. Moreover, they are required for ‘break the glass’ emergency access scenarios as well as more mundane day-to-day tasks.

A survey conducted by Thycotic of 201 hackers at Black Hat USA 2015 found most agreed that privileged accounts aren’t as secure as we think they are, and that little has been done to improve on such account security in recent years. According to the survey, despite an increase in security spending, 75 percent of hackers haven’t seen any real change in the level of difficulty in compromising privileged account credentials. In fact, the vast majority said it may be even easier to hack into these accounts than it was just a couple of years ago.
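One concrete, low-effort control the survey findings point toward is routine auditing of privileged credentials. The sketch below is a hypothetical audit script, not a reference to any specific product; the inventory field names and the 90-day rotation window are assumptions:

```python
# Hypothetical privileged-credential audit: flag privileged accounts
# whose credentials have not been rotated within a policy window.
# Field names and the 90-day window are illustrative assumptions.
from datetime import date, timedelta

ROTATION_WINDOW = timedelta(days=90)

def stale_privileged_accounts(inventory, today):
    """Return names of privileged accounts past the rotation window."""
    return [a["name"] for a in inventory
            if a["privileged"]
            and today - a["last_rotated"] > ROTATION_WINDOW]

inventory = [
    {"name": "root",      "privileged": True,  "last_rotated": date(2015, 1, 10)},
    {"name": "dba_admin", "privileged": True,  "last_rotated": date(2015, 8, 1)},
    {"name": "jsmith",    "privileged": False, "last_rotated": date(2014, 6, 1)},
]
print(stale_privileged_accounts(inventory, date(2015, 8, 20)))  # ['root']
```

Running such a check on a schedule, and pairing it with session monitoring for the accounts it flags, addresses exactly the gap the hackers surveyed describe: privileged credentials that sit unchanged for years.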



(TNS) - It used to be that a "Visitors must report to the office" sign was enough.

That was before multiple shootings at Columbine High School, Virginia Tech and other schools made the conversation surrounding school security at all levels more urgent.

The events of Dec. 14, 2012, brought that difficult conversation closer to home. Adam Lanza, 20 at the time, used a semi-automatic AR-15 assault rifle to shoot his way into Sandy Hook Elementary School in Newtown, killing 20 first-graders and six educators before turning the gun on himself.

In the aftermath, sections 86 and 87 of Public Act 13-3 became law, giving all public school districts in the state until July 1, 2014, to create school security and safety committees and to develop school security and safety plans.



BILOXI, Miss. – It’s been nearly ten years since Hurricane Katrina left widespread destruction along the Mississippi Coast. In the storm’s path, more than 234,000 homes were damaged or destroyed and more than one million people, a third of Mississippi’s population, were affected.

During the ten years of recovery, the Mississippi Emergency Management Agency and the Federal Emergency Management Agency have collaborated with local governments and communities statewide to ensure that Mississippi rebuilds stronger and safer.

“FEMA was there to assist the state of Mississippi days before the storm made landfall and this partnership remains strong today,” said Robert Latham, Executive Director of the Mississippi Emergency Management Agency.  “They have continued to support with financial and technical assistance to help rebuild Mississippi and make us more resilient.”   

The following is a snapshot of FEMA and state assistance provided throughout the state during the last ten years:

Helping individuals and families:

More than $1.3 billion was spent to help individuals and families meet their basic needs and begin to recover. More than 126,000 families received rental assistance – with more than 45,000 families provided with a temporary housing unit.

Rebuilding Mississippi’s Infrastructure:

MEMA administers FEMA’s Public Assistance funds. To date, FEMA has obligated over $3.2 billion – the amount committed to restore schools, public buildings, roads and bridges, medical facilities, parks and other infrastructure and for debris removal and emergency response during and after the storm.  

The current water and sewer infrastructure project underway in the City of Biloxi is the largest Public Assistance project in Mississippi following Hurricane Katrina. FEMA obligated over $363 million for this project.                             

Historic preservation

In an innovative agreement to preserve historic properties after a disaster, FEMA partnered with several agencies to streamline the process required by the National Historic Preservation Act. Under this agreement – called the Secondary Programmatic Agreement – FEMA’s historic and archaeological specialists used GPS data to survey thousands of historic properties, districts and archaeological sites in the lower Mississippi counties most affected by Katrina. This survey is nearly 94 percent complete.

FEMA has worked with the state of Mississippi to safeguard these treasures in our Public Assistance and Hazard Mitigation work through extensive environmental/historical assessments and collaborative decisions.

Preparing for future disasters

FEMA has obligated $314 million for Hazard Mitigation in federal funds for safe rooms, shelters, hurricane-proofing and other projects to reduce the effect of future disasters. This is part of the $364 million available to Mississippi for projects to reduce the impact of disasters on people and property. The balance of the remaining funds to be obligated is just over $50 million. To date, $159 million has been obligated for safe rooms across the state. MEMA manages the Hazard Mitigation Grant Program in Mississippi. It identifies projects and manages them from beginning to closeout.

As we reach the ten-year mark and the Hurricane Katrina recovery mission is nearly complete, Mississippi’s new and rebuilt infrastructure will be less vulnerable to future storms than in 2005. “Our strong partnership with the state of Mississippi was the key part in making our recovery efforts a reality for Mississippians,” said FEMA Mississippi Recovery Office Acting Director Laura Hill. “FEMA is proud of having worked with Mississippi in our rebuilding efforts to make the state stronger and better prepared.”


Maintaining enterprise security only gets more difficult, as additional means of cyberattack and increasingly sophisticated techniques are added to attackers’ arsenal.

“Our personal and professional attack surfaces have never been greater, and they are only expected to grow as organizations and individuals continue to increase their reliance on the digitally connected world for a variety of tasks,” explained researchers from network infrastructure and security services company Verisign. “Security practitioners must not only protect their enterprise assets, but also guard against threats to their supply chain and other business ecosystems. These threats, coupled with the cyber threat landscape’s continuous evolution in terms of actors, tactics and motivations, have created a situation where organizations must now move toward an intelligence-driven, holistic security approach to keep pace with the rapid changes in attackers’ tactics, techniques and procedures (TTPs).”

According to Verisign’s “2015 Cyber Threats and Trends: What You Need to Know to Protect Your Data,” the top cyberrisks from 2014 and the first half of 2015 came from:



Some MSPs view the cloud as children regard a dark place – it’s scary. But scary as it may seem, every MSP needs some kind of cloud strategy.

To be fair, plenty of MSPs have successfully added cloud services and, in the process, reinvented themselves as well-rounded managed IT service providers. They realized that adding cloud solutions is a logical and necessary step.

“I see cloud as a delivery mechanism for managed services,” says MSPAlliance CEO Charles Weaver. As an MSP, he points out, you don’t have to change billing practices or radically alter your business model when adding cloud offerings. Rather, you are adding services that complement what you already do.



Monday, 17 August 2015 00:00

Are We Heading for Storage Armageddon?

Many of us remember those old computer messages warning that no more storage space was available on the hard drive. The user had no choice but to offload some data or spend hours going through the files to find material that could be deleted. Enterprise users will also recall the fondness some storage administrators had for sending out those “You have exceeded your storage quota” notices. Fortunately, the size of today’s hard disk drives (HDDs) seems to have brought about a virtual disappearance of such unwelcome communications.

But what about the storage world as a whole? We are merrily packing information onto digital storage from every conceivable angle. Mobile phones, big data, the cloud, tablets, the internet of things (IoT), analytics and more are gobbling up available storage capacity at an alarming rate.

Could we reach the point where there is simply nowhere left to put all this data? In other words, could we arrive at Storage Armageddon or Stor-mageddon?



Hacktivists, cyber criminals and other threat actors are radically changing the way enterprises handle information security.

No longer just an IT issue, security is an urgent, strategic business concern. Customers are worried and looking for answers – and that means new opportunities for partners to sell Citrix networking solutions including NetScaler and CloudBridge.

The rapid growth of mobile and cloud environments, and their unique security issues, increase the revenue potential of these solutions.



Cloud service providers have a lot to celebrate – since 2011, adoption of software as a service has more than quintupled, from 13 percent to 72 percent in 2014. The growth of services such as cloud-based file sharing is driven by startups, for which the cloud is the great equalizer – allowing them to use tools and applications usually restricted to companies with deeper pockets.

In fact, a Rackspace survey on the economic impact of the cloud found that a quarter of the small and medium-sized businesses surveyed saw profit increases of 25 to 75 percent after moving to the cloud. Eighty-four percent of businesses were also able to increase reinvestment in the company by 50 percent.

While these are no doubt fantastic numbers, many companies remain concerned about the security of their data in transit. Despite the fact that most clouds have very robust security features, many companies are hesitant to give up local control of their data. As an MSP offering cloud-based file sharing, you need to be prepared to clear up any doubts prospective clients might have. Next time you meet with a prospective client, consider asking the following questions:



Better than any report on the federal government’s “critical skills gap,” the cybertheft of 22 million federal personnel records demonstrates Uncle Sam’s need for cyber experts.

Agencies did recruit, but too late: most of the experts they wanted were already committed to private industry.

That illustrates one reason cybersecurity, or more accurately cyber-insecurity as shown by the Office of Personnel Management data breach, remains on the Government Accountability Office’s 2015 high-risk list. “Although steps have been taken to close critical skills gaps in the cybersecurity area,” GAO says, “it remains an ongoing problem and additional efforts are needed to address this issue government-wide.”



Monday, 17 August 2015 00:00

Cybersecurity: Fix It or Die?

Two of the largest hacking conferences, Black Hat and DEF CON, highlighted some of the scariest vulnerabilities in cyberattacks today. From a Wi-Fi-connected rifle to a Tesla electric car, a Brinks safe and an electric skateboard, there seemed no end to the demonstrations of what a hacker can do.

From unlocking cars and opening garages to hacking a satellite, the breach demonstrations made a clear point about cyberattacks: They are very real and can be very dangerous.

Although content database hacking is still a concern, as shown by the recent hack of the Pentagon's nonclassified email system, a more dangerous and lethal capability is now being demonstrated in our increasingly device-connected world. Gartner projects 25 billion connected devices will be in use by 2020, and a recent HP study shows that more than 70 percent of Internet of Things (IoT) devices have vulnerabilities that can be exploited.



What's a lifeline service? In the telecom industry, we used to say landline voice was such a service, but that's certainly no longer the case. Mobile or broadband Internet? To many people, those services seem like lifelines.

What about electricity, nuclear power, other forms of energy like oil and gas? Or transportation systems -- highways, railways and airline networks? And don't forget public safety -- everything from the local first responders to national homeland security and border management. There's little argument that all of the above are lifeline services as much as any telecom service is.

Yet, despite the extreme importance of these services, some of the world's critical infrastructure for enabling these lifeline services could be at risk for potentially devastating cyber security attacks. We aren't necessarily talking about hacker schemes targeting the IT systems of the companies operating this infrastructure the way Target and Sony have suffered embarrassing breaches.



Friday, 14 August 2015 00:00

Wear and tear of SSDs

Unlike traditional hard drives, data in an SSD is not stored on a magnetic surface but inside flash memory chips (NAND flash). By design, an SSD consists of a circuit board, a few memory chips (the number depending on the drive's capacity in GB) and a controller that manages all operations.

The memory of SSDs is a non-volatile memory, in other words, it’s able to retain data even without power. We can imagine the data stored in the NAND flash chips as an electric charge preserved in each cell. With that in mind, the question arises: how long is the lifespan of an SSD?
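As a rough back-of-the-envelope, an SSD's remaining life can be estimated from its rated endurance (TBW, terabytes written) and daily write volume. The figures below are illustrative assumptions, not specs from any particular drive:

```python
# Rough SSD lifespan estimate from the drive's rated endurance.
# The TBW rating, daily write volume and write-amplification factor
# used here are hypothetical, illustrative values.

def ssd_lifespan_years(rated_tbw: float, daily_writes_gb: float,
                       write_amplification: float = 2.0) -> float:
    """Years until the rated endurance is exhausted.

    write_amplification accounts for the extra NAND writes the
    controller performs beyond what the host actually sends.
    """
    nand_writes_per_day_tb = daily_writes_gb * write_amplification / 1000
    return rated_tbw / nand_writes_per_day_tb / 365

# e.g. a 600 TBW drive written 50 GB/day with 2x write amplification
print(f"{ssd_lifespan_years(600, 50):.1f} years")
```

Under these assumed numbers the cell endurance outlasts the typical service life of the machine, which matches the general experience that wear-out is rarely the first failure mode.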



Have you thought about where your data is at greatest risk? If you haven’t, you should, because where that risk lies may surprise you.

Bromium, a global enterprise security company, asked Black Hat attendees about the state of security, querying them about security trends, the security of Windows 10, and where to find the source of the worst security risks. The answer to that last question wasn’t the network or the cloud. Fifty-five percent of the 100 respondents said endpoints are the security risk they are most concerned about (compared to 27 percent who cited insider threats and 9 percent for both the cloud and the network).

What makes endpoints such a security risk? According to the survey, “humans are just one element that makes the endpoint the source of the greatest security risk. Another major factor is vulnerable software.”



Scope creep can be disastrous to a managed service provider delivering cloud-based file sharing -- and is one of the major reasons why a service level agreement is so imperative. Scope creep can occur due to either internal or external drivers, but either way it is almost always detrimental to the system as a whole.

Understanding how scope creep happens and how you can manage it can help you keep your cloud projects on track and under control while still keeping your clients happy.



AUSTIN, Texas – Texans will have the opportunity to assist with the state’s disaster recovery from the severe storms, tornadoes, and flooding that occurred from May 4 to June 22. Dozens of qualified Texans will be offered temporary jobs as local hires of the Federal Emergency Management Agency (FEMA) in its Austin, Denton, and Houston offices.

FEMA has partnered in this venture with the Texas Workforce Commission (TWC). Those interested may go to http://www.workintexas.com and create an account. Once logged in, click on “Search All Jobs” and type “FEMA” into the search bar.

Currently, there are several job categories posted:

  • Customer service
  • Logistics
  • Switchboard/Help desk
  • Project Specialist
  • Technical/Architecture/Engineering
  • Environment Restoration/Anthropologists/Biology/Historic Preservation

FEMA positions with detailed job descriptions will remain posted until the jobs are filled.

Candidates must be 18 years of age or older and must be U.S. citizens. Qualified applications will be forwarded to FEMA staff, who will select candidates for interviews. Selected candidates must have a valid government identification card, such as a driver’s license or military ID. Candidates will be required to complete a background investigation, which includes fingerprinting, and to present additional ID, such as a Social Security card, birth certificate or passport. The hiring process may take up to 15 days from the date of application.

FEMA is committed to employing a highly qualified workforce that reflects the diversity of our nation. All applicants will receive consideration without regard to race, color, national origin, sex, age, political affiliation, non-disqualifying physical handicap, sexual orientation, and any other non-merit factor. The federal government is an Equal Opportunity Employer.

More positions may be posted on the TWC webpage as the disaster recovery continues.

All are encouraged to visit https://www.fema.gov/disaster/4223 for news and information about this disaster.


All FEMA disaster assistance will be provided without discrimination on the grounds of race, color, sex (including sexual harassment), religion, national origin, age, disability, limited English proficiency, economic status, or retaliation. If you believe your civil rights are being violated, call 800-621-3362 or 800-462-7585 (TTY/TDD).

FEMA’s mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards.

The SBA is the federal government’s primary source of money for the long-term rebuilding of disaster-damaged private property. SBA helps businesses of all sizes, private non-profit organizations, homeowners and renters fund repairs or rebuilding efforts and cover the cost of replacing lost or disaster-damaged personal property. These disaster loans cover losses not fully compensated by insurance or other recoveries and do not duplicate benefits of other agencies or organizations. For more information, applicants may contact SBA’s Disaster Assistance Customer Service Center by calling 800-659-2955 or by visiting SBA’s website at www.sba.gov/disaster. Deaf and hard-of-hearing individuals may call 800-877-8339.

FEMA’s temporary housing assistance and grants for childcare, medical, dental expenses and/or funeral expenses do not require individuals to apply for an SBA loan. However, those who receive SBA loan applications must submit them to SBA to be eligible for assistance that covers personal property, transportation, vehicle repair or replacement, and moving and storage expenses.


SAIPAN, CNMI – The government of the Commonwealth of the Northern Marianas (CNMI), the American Red Cross (ARC) and Federal Emergency Management Agency (FEMA) announced an expansion of enhanced resource assistance for Saipan residents affected by Typhoon Soudelor. Using FEMA supplies, the ARC, which has already provided assistance to some 3,000 residents, will make the aid available.

Residents who are already registered with the ARC and considered to have the greatest need will be processed first, said the ARC’s operations director, Denise Everhart.

ARC has a list of more than 3,000 individuals who have called the chapter looking for assistance and is contacting those with the greatest need first for one-on-one casework. ARC will then supply Client Assistance Cards with some money for fuel, phone and laundry, as well as tarps, water, buckets, hygiene supplies, food and other supplies. This will continue until the list is complete.

Those who cannot get to the designated ARC chapter, located at 1 Airport Road, should call the ARC at 670-234-3459, and the ARC will work with FEMA and CNMI to accommodate them on a case-by-case basis.

“Working through the CNMI Government and the American Red Cross is the best and fastest way to get the basic life sustaining supplies into the hands of the residents of Saipan that need them most,” said FEMA’s Federal Coordinating Officer, Steve DeBlasio.

“These supplies will go a long way in allowing the government of CNMI and the rest of our federal and private sector partners to create solutions to problems.” DeBlasio also thanked the US Navy and the US Marine Corps for their assistance in supporting the humanitarian mission on Saipan, and praised the resilience and patience of the citizens of the CNMI.

Supplies in addition to what FEMA already had on the ground in CNMI were transported from Guam on the USS Ashland last Saturday. That cargo contained generators of various sizes, as well as large amounts of food, drinking water, tents, and the heavy equipment needed to move the larger generators. The USS Ashland is expected to off-load Guam Power Authority heavy equipment on Saipan this morning. This equipment will help expedite the restoration of electrical infrastructure on Saipan.

“The residents of Saipan have been resilient, patient, and extremely hospitable under very trying and austere circumstances,” DeBlasio said. “They deserve our thanks and our help.”

DeBlasio also encouraged disaster survivors on Saipan to continue to register for FEMA assistance by calling 1-800-621 FEMA (3362), adding that more than 2,500 households had already done so.

Last Updated: August 13, 2015 - 11:17
Friday, 14 August 2015 00:00

Regional Collaboration: Rural Style

Darrell Ruby is the regional coordinator for Washington State Homeland Security (HLS) Region 9 for Greater Spokane Emergency Management (GSEM) in Washington state. (This is not a Washington State Emergency Management Division position). Region 9 is composed of the 10 counties and three tribes of eastern Washington. His role is to support regional collaboration, coordination and an interagency approach to all-hazard emergency preparedness.

For more than 10 years, he has served GSEM in all phases of emergency management supporting planning, training, exercises, HLS grants and grant-related projects. He is a certified emergency manager, Incident Command System (ICS) trainer, has an undergraduate degree in construction science from Texas A&M University, a master’s degree in business, and remains active in the naval reserve as an explosive ordnance disposal officer. He responded to a series of questions about what makes a successful regional rural approach to emergency management.



Disaster recovery has traditionally been sold like “earthquake” insurance--as if it were only for natural disasters. The reality is that 75% of downtime results from human error--completely unrelated to natural disasters. This puts businesses in a precarious position, as many have no way to mitigate the effects of system downtime.

Consequently, IT failures have increased in frequency, becoming more the status quo than the anomaly. And, unfortunately, these failures cost 80% of SMBs at least $20,000 per hour (source: IDC). That said, downtime costs vary significantly within industries, especially due to the different types of downtime. A failure of a critical application can lead to a few types of losses:
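The cited per-hour figure translates directly into a rough outage-cost estimate. A minimal sketch using the IDC floor rate, with hypothetical outage durations:

```python
# Back-of-the-envelope downtime cost, using the $20,000/hour floor figure
# cited from IDC; the outage durations chosen below are hypothetical.
COST_PER_HOUR = 20_000

def downtime_cost(hours: float, cost_per_hour: float = COST_PER_HOUR) -> int:
    """Estimated direct loss for an outage of the given duration."""
    return round(hours * cost_per_hour)

# a four-hour outage at the cited SMB floor rate
print(downtime_cost(4))  # 80000
```

Actual losses vary widely by industry and by which application fails, so in practice each critical system would get its own cost-per-hour input.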



The cyber insurance market for small- to mid-sized companies is much friendlier than the market for larger insureds, according to the findings of an annual survey just released by Betterley Risk Consultants.

The Cyber/Privacy Insurance Market Survey 2015 notes that there are many insurance products competing for the business of small and mid-sized (SME) organizations.

Brokers are actively selling cyber policies to their SME insureds, and more are buying than ever before, as they realize the potential for liability, breach and response costs, arising out of the possession of private data.



If FDE and FLE sound like twins to you, you could be on the right track for a comprehensive approach to keeping your data confidential. Indeed, FDE (full disk encryption) and FLE (file level encryption) both have security advantages to offer on their own – and even more when they are used together. Conversely, this means that neither encryption approach replaces the other. In particular, FDE protects data at rest on a PC hard disk, for example, whereas FLE protects data in motion, as in files that are being transferred or copied to other systems. Both can benefit from paying attention to the following.



As if to reiterate recent reports on small to midsize business (SMB) security issues, CompTIA recently released findings from its latest report, which found that in the digital age, security has become a huge concern.

Not surprising is the fact that SMBs have recognized the need to find new, technology-based ways to reach their customer bases. Along with embracing new technology, though, have come troubling issues with a lack of budget and staff to back up the new tech with a strong security plan. As CompTIA’s VP of Member Communities, Jim Hamilton, told ARN, SMBs are seeking ways to gain technology and implement security on the cheap:

“Without an abundance of capital to invest in technology initiatives, many firms seek the best value or the lowest cost option. [They are] choosing to handle technology issues internally using employees who may be tech savvy but actually hold other jobs such as sales or accounting.”



Eighty-one percent of MSPs deliver some level of security to clients, according to CompTIA. But how good are MSPs at addressing the human factor?

IBM estimates human error contributes to at least 95 percent of security incidents, while Verizon has concluded mistakes by internal staff, especially administrators, were “prime actors” in more than 60 percent of incidents. While most of those insider threats result from negligence rather than malice, the outcome is the same – a vulnerable IT environment.



Doing Business in a Big Data World

Big Data — Will it supercharge the economy, and revolutionize how companies compete? Will it tyrannize us all, as governments and businesses track and anticipate our every move? Or is it all just hype?

In Digital Exhaust, leading digital expert Dale Neef cuts through the breathless enthusiasm and dystopic sci-fi visions, placing Big Data in a realistic context that reflects the larger technological and economic processes that are changing our world.

Neef explains how Big Data works, what can be done with it, and what it all really means. Neef shows how an emerging Big Data-intelligence complex is innovating at a pace that is increasingly difficult to absorb or regulate. Then he assesses the implications: not just for civil liberties and personal privacy, but for businesses, the economy, law, and even geopolitics.



Thursday, 13 August 2015 00:00

Changes to the Core of the Enterprise

“The times they are a-changin’” would make a good theme for the enterprise these days. In virtually every aspect, organizations across the board are transforming into digital entities and are rapidly discovering the challenges and opportunities that this change represents.

Some may argue that the enterprise has always been changing, from the introduction of the first mainframe to the cloud, but by and large this was a change to enterprise technology. The hardware and software changed, but these were almost always aimed at improving traditional processes and workflows that had existed in their basic forms for decades.

The difference today is that technological change is producing fundamental, functional change in the enterprise and driving an entirely new economic model in the process.



(TNS) - The automated communication system designed to "quickly and directly send messages" to Naperville, Ill., residents and keep them informed during emergencies apparently failed Monday morning, during a police manhunt for two robbery suspects.

The Naper Notify Mass Notification System "may have malfunctioned" and failed to alert at least some residents to the robbery, and the hours-long police search of their neighborhoods, Naperville police Cmdr. Lou Cammiso confirmed late Monday afternoon. Police "did not know that at the time, and are trying to determine what the malfunction was," Cammiso said in an email.

Police set up a perimeter and began a house-to-house search following a Monday morning robbery. Two men fled from the apparent getaway car police were trying to stop on Thornapple Drive just south of Aurora Avenue, a block or two east of the police station.



(TNS) -- The National Science Foundation has awarded Oklahoma State University and three partner universities $6 million to develop an integrated unmanned aircraft system to improve weather forecasting through the study of atmospheric physics.

The four-year grant will support the collaboration of researchers from OSU and the universities of Oklahoma, Nebraska and Kentucky.

The project’s goal is the development of small, affordable unmanned systems to be used by government and university scientists and private companies to expand the understanding of atmospheric conditions and improve weather forecasting.



Everyone wants to do more with less. In the data center, this means increasing the data load while reducing hardware, infrastructure, management and power consumption.

While most of these items are achievable with virtualization and automation, the power equation is a bit trickier, if only because most people outside the industry fail to appreciate the connection between the data services they demand and the energy it takes to provide them. Even if systems are more efficient, at the end of the day, the data center industry is still consuming steadily more power.

Admittedly, part of this is due to lack of participation from the data center industry. As a recent survey from IDC pointed out, the bread-and-butter enterprise still has not jumped on the energy efficiency bandwagon like the web-scale industry has. Simple economics plays a big part in this equation: Large-scale facilities have to drive efficiency to new levels lest their energy budgets crash the entire business model. As well, standard enterprises often have lower utilization rates in order to protect critical apps and services, whereas large cloud providers are more adept at shifting loads should key components go dark.



As IT organizations begin to routinely collect massive amounts of data, deciding who inside the organization should have access to that information is becoming a thorny issue. Business analysts often want to compare and contrast random sets of data in the hopes of discovering new patterns and insights regardless of the sensitivity of the data. This often puts them at loggerheads with IT organizations that have long been responsible for overseeing data management.

To reduce that friction, Paxata, a provider of a data preparation platform that runs on top of Apache Spark clusters, has added two-factor governance tools to the Paxata Summer '15 release. The tools give data administrators control over all functional permissions, such as who can perform what types of tasks, while resource permissions -- who has access to which data sets and projects -- can be set by analysts.
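The split described above can be sketched as two independent permission checks that must both pass. All names here are hypothetical illustrations, not Paxata's actual API:

```python
# Illustrative "two-factor" governance check: an action is allowed only when
# the admin-managed functional permission AND the analyst-managed resource
# permission both grant it. Users, actions and datasets are invented.
functional = {"alice": {"transform", "export"}, "bob": {"transform"}}  # set by admins
resource = {"sales_q3": {"alice"}, "hr_data": set()}                   # set by analysts

def allowed(user: str, action: str, dataset: str) -> bool:
    """True only if both permission layers grant access independently."""
    return (action in functional.get(user, set())
            and user in resource.get(dataset, set()))

print(allowed("alice", "export", "sales_q3"))   # True
print(allowed("bob", "transform", "sales_q3"))  # False: no resource grant
```

The point of the design is that neither group can unilaterally expose data: admins decide what a user may do, analysts decide which data that user may do it to.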



(TNS) - Local emergency managers agree that a recent New Yorker article was more than a little over the top in implying that everything west of Interstate 5 will be “toast” after the next Cascadia megaquake hits the Pacific Northwest.

While the region’s infrastructure will be in tatters, most newer buildings should ride out the shaking fairly well. But engineers and civic leaders have known for decades that one type of structure will, indeed, be reduced to rubble: old brick buildings.

From corner groceries to churches, offices and multistory apartment blocks, thousands of these seismic death traps are scattered through neighborhoods in Seattle, Tacoma, Portland and other Northwest cities.

Yet little has been done to require owners to retrofit or even warn occupants that the walls around them are likely to collapse in a major quake.



Rogue drone operators are rapidly becoming a national nuisance, invading sensitive airspace and private property — with the regulators of the nation’s skies largely powerless to stop them.

In recent days, drones have smuggled drugs into an Ohio prison, smashed against a Cincinnati skyscraper, impeded efforts to fight wildfires in California and nearly collided with three airliners over New York City.

Earlier this summer, a runaway two-pound drone struck a woman at a gay pride parade in Seattle, knocking her unconscious. In Albuquerque, a drone buzzed into a crowd at an outdoor festival, injuring a bystander. In Tampa, a drone reportedly stalked a woman outside a downtown bar before crashing into her car.



(TNS) -- As the camera attached to its underbelly snapped pictures, the drone glided a few hundred feet above the quiet, tree-lined suburban streets of North Coventry Township.

It was tracing the path of a killer, investigators say.

Chester County, Pa., prosecutors are hoping the images captured by the unmanned device, driven by four propellers and weighing less than a half-gallon of milk, will help prove that a man arrested last month carefully planned his fatal attack on a rival who was involved with his ex-girlfriend.

As an alternative to costly helicopter reconnaissance flights, the county says, the craft that it bought last fall for $1,800 is saving taxpayers thousands of dollars.

Drones such as this one are becoming ever more popular across the nation for investigative and other purposes, with industry officials projecting that 20,000 will be purchased annually by public-safety agencies by 2025.

They also have stoked privacy concerns.



Geary Sikich looks at the subject of collateral risk and shows how the concept can be used in risk management processes.


The Law Dictionary defines collateral risk as:

The risk of loss arising from errors in the nature, quantity, pricing, or characteristics of collateral securing a transaction with credit risk.  Institutions that actively accept and deliver collateral and are unable to manage the process accurately are susceptible to loss.  A subcategory of process risk.

The military defines collateral risk in terms of ‘risk to mission’ as depicted in figure one below:



This paper by Jim Burtles, Hon. FBCI, is an attempt to bring a simple but effective and comprehensive approach to the development and delivery of business continuity solutions. It is the third article in a series where we are publishing the short listed entries in the Continuity Central Business Continuity Paper of the Year competition.

Some forty years of experience have led me to the conclusion that it is important to have a broad understanding of what we are trying to achieve right from the start of any business continuity development program. A broad understanding does not require a detailed set of objectives, pre-determined procedures or specific deliverables; such a cumbersome short-sighted approach usually leads to a solution which appears to meet the prescribed parameters rather than one that solves the actual problem or provides adequate protection. I suggest that we should try to base our approach upon a generic, but comprehensive, model that shows which areas should be considered and covered by our plans and procedures.

Whenever we are trying to develop our ideas and understanding of any practical subject it is more reliable and effective to work from a basic concept which we can visualise and remember. Simple pictures and basic shapes are more powerful starting points than strings of words which can soon lose their meaning and relevance as the project moves forward and the detail begins to reveal itself. For this reason I have based my hypothesis upon a hexagon, a simple six-sided figure which is easy to remember and visualise.

Business continuity is a relatively modern management discipline, derived in the 1980s from disaster recovery which only began in the mid-1970s. Consequently, it is still evolving and refining its language, concepts and techniques in order to match an ever-changing business environment. This steady advance requires, and includes, the definition and refinement of a generally accepted code of good practice together with an agreed terminology which can be used to form the basis of relevant standards, regulations and guidelines. We are slowly acquiring a common body of knowledge, experience and information which supports the ongoing development and expansion that is happening within a number of disparate and often unconnected schools of thought.



Like a medical examination, the result of penetration testing to assess your organisation’s IT security is technically only valid at the moment it is performed. Independently of how thorough such ‘pen tests’ are, the context in which they are performed changes on a frequent basis. IT hardware and software vendors release new versions and patches of firmware, operating systems and applications. Hackers invent new attack vectors. Employees come and go, and business partners and suppliers, with whom you collaborate and share information, change too. If the business and IT environment fluctuates so much, why then is it still important to do penetration testing?



Tuesday, 11 August 2015 00:00

Even Security Companies Get Breached

We depend on security companies for several things. First and foremost, to provide the software and tools that help keep our own networks and data secure. Second, to be the front line of the latest security issues; while we may only know some companies by their AV software, most are also involved in research and detection of new vulnerabilities and malware. Third, we expect them to be the shining example of how good security is done.

So what happens when the security companies are the victim?

In July came the announcement that Bitdefender had suffered a data breach in which a small number of customer usernames and passwords were compromised. According to eSecurity Planet, the breach was caused by human error and outdated software. The article also pointed out that those responsible turned to blackmail, demanding a ransom for the customer data and threatening to release it otherwise (which they did a day or two after the threat).



(TNS) - Each day, freight trains slice through the center of the city at a swift 70 miles an hour, carrying industry goods eastward and westward. With the explosion of heavy train traffic stemming from the Permian Basin oil boom, the threat of rail-related accidents looms larger.

Earlier this summer, a freight train slipped off its tracks in Odessa. Ten rail cars carrying hydraulic fracturing sand derailed and fell sideways along the track. About a week ago, Midland County Fire Marshal Dale Little saw the derailed cars still belly-up, causing him to ask the critical question: “What if that had been oil or a chemical?”

Throughout the past 10 years, 109 hazmat-carrying train cars have been involved in accidents, according to data rail lines report to the Federal Railroad Administration. In that same period, five instances of derailments have been reported inside Ector County. To the east, in Howard County, 54 cars carrying hazardous materials have been involved in accidents with six instances of derailments in the last decade, according to federal data.



Machine learning is all about algorithms. It has been used by the financial industry to spot fraud and is supposed to predict user behavior.

So how does machine learning intersect with IT security?

“Machine learning is the technology that underpins analytics in security,” says Travis Greene, Identity Solutions strategist at NetIQ, the security portfolio of Micro Focus. “Analytics is the distillation of data or statistics (in this case, security events) into meaningful information that is used for better decision making.”

Analytics, Greene goes on to say, is differentiated from reports, which are typically a graphical representation of data without an identification of trends, abnormalities, predictions or scoring, which analytics provides.
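As a hypothetical illustration of the distillation Greene describes (turning raw security events into scores rather than just charts), here is a minimal anomaly-scoring sketch. The event data, field names, and threshold are invented for illustration, not drawn from any NetIQ product:

```python
from statistics import mean, stdev

# Hypothetical security events: failed-login counts per user per day.
events = {
    "alice": [2, 1, 3, 2, 2, 1, 2],
    "bob":   [1, 2, 1, 1, 2, 1, 45],  # sudden spike on the last day
}

def anomaly_score(history):
    """Score the latest observation against the user's own baseline."""
    baseline, latest = history[:-1], history[-1]
    mu, sigma = mean(baseline), stdev(baseline)
    return (latest - mu) / sigma if sigma else 0.0

for user, history in events.items():
    score = anomaly_score(history)
    flag = "ABNORMAL" if score > 3 else "normal"
    print(f"{user}: score={score:.1f} ({flag})")
```

A report would simply chart the daily counts; the analytics step here adds the abnormality judgment, which is what supports the "better decision making" Greene refers to.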



With all of the high-profile data breaches occurring across the spectrum of industries over the last few years, enterprises are no longer in the dark as to the dangers lurking in the digital world. However, awareness of the problem is not the same as prevention. For managed service providers (MSPs), preventing security breaches in cloud data storage and cloud-based file sharing may mean collaborating for a better understanding of how to stay ahead of the hackers.

In nearly every industry, laws have been passed or regulations put in place to help organizations keep their sensitive data and information safe from unwanted eyes. But is it enough? One could argue that these laws and regulations only provide the explicit, transcribed details of what malicious parties will be up against. For many hackers, it's as if their homework has been done for them.



(TNS) - On Aug. 29, it will have been a decade.

A decade since Katrina hit New Orleans.

Ten years have passed. Although it seems much longer.

And no time at all. The waters have receded. The refrigerators and their fetid contents have been carted away. The dead have been buried.

The mold has been conquered. For the most part. New Orleans is back. Better than ever.

But Katrina is still a presence here. The one watermark that cannot be scrubbed away. The line that divides New Orleans into pre- and post-hurricane. Post-Katrina is different. For the locals. For the visitors, too.



Tuesday, 11 August 2015 00:00

Is the Hard Disk Past Its Prime?

People have been calling for an end to the spinning disk drive for quite a while, but is it possible that it could actually come to pass over the next few years?

The history of technology innovation argues against it. Rarely does something new come along that pushes the old way into extinction – broadcast TV is still with us, as are trains, and the postal system still delivers the odd hand-written letter.

But when it comes to the disk drive, it is getting harder and harder to make a case for its continued deployment in either enterprise or consumer settings, even for long-term, archival applications. It's not just that solid state solutions are faster and more flexible than disk; they are also becoming increasingly suited to the types of modular infrastructure that are currently finding favor in both the cloud and the enterprise, not to mention more conducive to the distributed resource and application environments that are driving the new economy.



Tuesday, 11 August 2015 00:00

The Study of Risk and Uncertainty

I’ve been reviewing any number of books on risk, trying to find texts that resonate for teaching enterprise risk management to undergraduates this fall.  The fact of the matter is that most books on risk are written either by academics whose primary expertise lies in some ancillary area of study and who often lack experience making risk-related decisions in an organizational environment, or by analysts from the financial sector, who focus primarily on financial or compliance risk.  As a result, there is usually no broader context into which the practice of risk management is placed.  I want more for my students, so they will be reading from more than one book, along with a range of white papers and articles.

I will be using excerpts from Peter L. Bernstein’s non-textbook, called Against the Gods: The Remarkable Story of Risk because he tells the story of risk and probability theory through the centuries with clarity and color.  “The revolutionary idea that defines the boundary between modern times and the past is the mastery of risk: the notion that the future is more than a whim of the gods and that men and women are not passive before nature.  Until human beings discovered a way across that boundary, the future was a mirror of the past or the murky domain of oracles and soothsayers who held a monopoly over knowledge of anticipated events.” (Introduction, p.1)



The first payments are being made to policyholders taking part in the Federal Emergency Management Agency’s (FEMA) Hurricane Sandy Claims Review, the agency announced today.

The payments represent additional funds owed to National Flood Insurance Program (NFIP) policyholders who filed flood insurance claims after Hurricane Sandy in 2012.

“We want to ensure our policyholders are paid what they are owed under their policies. This claims review gives us a chance to take another look,” said Roy Wright, Deputy Associate Administrator for FEMA’s Federal Insurance and Mitigation Administration. “I encourage policyholders to request a review if they believe their Hurricane Sandy claim was underpaid for any reason.”

In May 2015, FEMA began contacting 142,000 NFIP policyholders who filed claims resulting from Hurricane Sandy, offering to review their claim files. To date, more than 10,000 policyholders have entered the process. FEMA authorized the insurance companies writing NFIP policies to make the first additional payments to policyholders whose claims have been reviewed through this process.

The deadline to request a review is Sept. 15, 2015. After the initial request, the entire process usually takes around 90 days to complete.

To be eligible for the review, policyholders must have experienced flood damage between Oct. 27, 2012 and Nov. 6, 2012 as a result of Hurricane Sandy. Policyholders may call the NFIP’s Hurricane Sandy claims center at 866-337-4262 to request a review. Before contacting the claims center, policyholders are asked to have their flood insurance carrier name and policy number at hand.

Alternately, policyholders can go online to download a form requesting a review. The downloaded form may be filled out and emailed to the address provided to start the review process.

For individuals who are deaf, hard of hearing, or have a speech disability and use 711 or VRS, please call 866-337-4262.  For individuals using a TTY, please call 800-462-7585 to begin the review process.

The Sandy claims review process is designed to be simple for the policyholder, and does not require paid legal assistance. Several nonprofit service providers are ready to offer free advice and answer questions policyholders may have. A list of these advocacy groups can be found on the claims review web page.

FEMA's mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain and improve our capability to prepare for, protect against, respond to, recover from and mitigate all hazards.

Follow FEMA online at www.fema.gov/blog, www.twitter.com/fema, www.facebook.com/fema and www.youtube.com/fema.  Also, follow Administrator Craig Fugate's activities at www.twitter.com/craigatfema.

The social media links provided are for reference only. FEMA does not endorse any non-government websites, companies or applications.

Monday, 10 August 2015 00:00

Shadow IT: Worse Than We Thought

The enterprise has many reasons to convert legacy infrastructure into a private cloud: lower costs, greater flexibility and scalability. These are all perfectly valid reasons, but it seems that a key driver is the rise of shadow IT.

According to new research from Cisco, shadow IT (and I’m still waiting for someone to print the obvious acronym here) is much worse than previously suspected. A recent survey of data users showed that the use of unauthorized applications is 15 to 20 times higher than what many CIOs believe. On average, the report states, IT departments estimate their companies utilize about 50 cloud services, while in fact the number is 730. And the discrepancy between reality and perception is growing quickly: one year ago it was 7x; within six months it had jumped to 10x. At this rate, the number of shadow apps could top 1,000 for the average enterprise by the end of the year.
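The arithmetic behind that projection can be checked with a quick sketch. The figures are from the Cisco survey cited above, but the extrapolation method is an assumption, since the report does not say how the "top 1,000" figure was derived:

```python
# Figures reported in the Cisco survey.
estimated_by_it = 50   # cloud services IT departments believe are in use
actual_in_use = 730    # services actually in use

multiplier = actual_in_use / estimated_by_it
print(f"Current discrepancy: {multiplier:.0f}x")

# The gap reportedly grew from 7x to 10x to ~15x over successive
# half-year periods, i.e. roughly a 1.4x increase every six months
# (an assumed trend, extrapolated one more period).
growth_per_half_year = 1.4
projected = actual_in_use * growth_per_half_year
print(f"Projected services in six months: {projected:.0f}")
```

Under that assumed growth rate, the average enterprise clears 1,000 shadow services within six months, consistent with the article's year-end projection.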



(TNS) — An active shooter in a school — it’s everyone’s worst nightmare.

And, lately, FBI statistics show, it has been happening more often. A generic scenario goes something like this:

The shooter enters the building through the front doors and moves through the lobby. He — because FBI data show it’s almost always a he — is looking for targets. In the lobby he fires off a couple of shots.

Panic ensues.

People run for cover, and teachers work to follow practiced safety drills. Everyone is screaming and many are calling 911.

Meanwhile, the shooter moves to another floor, or another building, looking for more people to terrorize.

More often than not — in 67 percent of cases — the incident is over before police are able to engage the shooter.

That means every minute matters.



(TNS) - The likelihood of a below-average hurricane season is even higher than what forecasters first predicted in May, but whether the sluggish forecast and Florida’s 10-year lull signals a permanent shift in weather patterns is doubtful.

An updated storm forecast released Thursday by NOAA’s Climate Prediction Center upped the chances of a quieter than normal 2015 Atlantic hurricane season to 90 percent — the highest confidence level given since seasonal hurricane outlooks were first issued in 1998 and an increase from the previous 70 percent prediction.

The new report, which was timed to coincide with the beginning of the more active storm months of August through October, also reduced the number of storms expected this season. Overall, six to 10 named storms are forecast, down from the six to 11 predicted in May.



Monday, 10 August 2015 00:00

Top Ten Ways to Use OpenStack for Storage

In the first article in this series, we explained what OpenStack is. Now we are following it up with some tips for those thinking about implementing it for specific tasks. Here are ten popular ways to use this open source cloud computing software for storage purposes:



Monday, 10 August 2015 00:00

BCI: The value of standards

There have been many standards produced that support the work we do – ISO22301 on business continuity management, BS65000 on organizational resilience and ISO22317 on business impact analysis. But what are these standards for, what do they achieve and what value can we place on them?

BSI notes, in its latest report, The economic contribution of standards to the UK economy, that "the development of standards is driven by a demand from industry" and that standards "help to solve fundamental process, organisational and technical problems, which if left unresolved, could result in inefficient market functioning and poor economic outcomes." Or, to put it another way, they make our organizations more streamlined and therefore more efficient.

But can you actually put a value on this? Yes, according to the report, which was based on independent research conducted by the Centre for Economics and Business Research (Cebr). Cebr's analysis found a positive and significant contribution of standards to productivity: they supported 37.4% of annual labour productivity growth in the UK economy over the period 1921 to 2013, which translates into approximately 28.4% of annual GDP growth. To put an actual monetary figure on that, standardization at a national level would be associated with approximately £8.2 billion of the £29.0 billion of GDP growth recorded in 2013 (2014 prices).
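The monetary figure is consistent with the percentage claim, as a quick check of the Cebr numbers cited above shows:

```python
# Figures from the Cebr analysis (2014 prices).
standards_contribution = 8.2  # £bn of 2013 GDP growth attributed to standards
total_gdp_growth = 29.0       # £bn of GDP growth recorded in 2013

share = standards_contribution / total_gdp_growth * 100
print(f"Standards' share of 2013 GDP growth: {share:.1f}%")
```

The result, roughly 28.3%, lines up with the "approximately 28.4% of annual GDP growth" quoted from the report.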

That's quite an extraordinary finding, so if your business continuity management programme is not currently aligned to ISO22301 then you might want to consider doing so. And where better to start than reading through the Business Continuity Institute's Good Practice Guidelines that are fully aligned to the standard.

The report does note that standards do not boost productivity growth exclusively. Instead standards have a symbiotic and complementary role in driving productivity along with other factors such as improvements to education and advancements in technology. Standards support productivity growth through a variety of mechanisms such as by enhancing organisational efficiency, boosting trade and facilitating innovation.

The survey also highlighted the existing capacity of businesses to become more involved in the standards development process. Over two-thirds (68%) of businesses surveyed were not involved in the standards development process, yet the evidence showed that participating in developing standards makes it more likely that a company experiences benefits from using standards. Those who reported they are highly involved in the standards development process are the most likely to report that they experience a net benefit from standards.

Members of the BCI play an important role in standards development, with several of them on the relevant committees. You don't need to be on a committee to play a role, however; you can also get involved by providing feedback when standards are in development and 'out for comment'. If you would like to get involved, keep an eye on our website and social media channels, or subscribe to our newsletter.

Monday, 10 August 2015 00:00

Your digital shadow

Many people navigate the Internet daily in a completely natural and uninhibited manner, making intensive use of the many opportunities the World Wide Web has to offer. Thanks to ever faster evolving technology, it is almost always available, and the number of devices constantly communicating with the Internet keeps increasing.

But it is not only devices that make the digital universe explode; it is also the activities that have become standard use: emails, SMS, video files, MP3 music downloads, online banking, cloud computing and, last but not least, interactivity through social media. All of these ensure that the amount of information each individual consumes and produces increases rapidly.



As one of the U.S.’s largest processors of payments, Heartland Payment Systems is teaming up with emaginePOS to offer a cloud-based POS system integrated with Heartland Secure payment processing for small to midsize businesses (SMBs).

Through its Heartland Commerce venture, Heartland is helping SMBs strengthen their business operations by providing highly secure credit and debit card payment processing solutions. By adding the emaginePOS system, a whole new world of POS security and flexibility is opened up for SMBs. The emaginePOS cloud-based system “runs on virtually any hardware platform,” which will allow restaurants and retailers to integrate the system with their current touchscreen workstations, iPads and other tablets.

As one of the big players in the payments industry, Heartland offers many perks to software developers with whom they do business. As Heartland CEO Robert Carr explained in a statement:



I know a number of people who wear fitness trackers or other wearable devices because their employers’ health insurance either offers incentives for doing so or, in at least one case, requires they wear them.

In any case, wearables and apps that monitor our health have entered the workplace. Perhaps your company is one of those that use such a device or app. If so, what is being done to protect the data transmitted via these devices and apps? While this may seem more like an HR issue for now, the management of wearable devices needs to transfer to the IT and security departments (if it hasn’t already). The reason is simple: Employees are worried about security and privacy. According to a survey by Healthline, nearly half of respondents (45 percent) expressed serious concern about hackers gaining access to their medical information, and that concern is spread over a variety of devices. 



By Ben J. Carnevale

Cloud computing remains a strong topic of interest for organizations big and small. And, as with many topics and developing technologies concerned with use of the internet, risk management and cyber-security preparedness teams struggle to keep up with the terminology and risk mitigation strategies needed to make cloud technologies work successfully and effectively for their organizations.

To help that process along, our staff has recommended adding a recent article dealing with “cloud computing terms you need to know” to your organization’s information security and preparedness reading library.

Friday, 07 August 2015 00:00

3 Step Cloud Onboarding Plan

The cloud can be intimidating for companies. Many business leaders and even IT professionals don’t know what impact cloud-based file sharing will have on their business, what the risks are and whether it will add another layer of complexity to their organization.

When they do decide to adopt the cloud, many businesses are ready to jump in head first, but that doesn’t mean you should let them. It also doesn’t mean their initial concerns have been completely washed away.

Before any cloud implementation, MSPs should prepare their client, and themselves, for the project ahead. To do so, there are a few things you need to educate your client on, a few things you need to ask your client and a few things you should both agree on before getting started.



(TNS) - A consolidated 911 center in Richmond County is one step closer to becoming a reality.

County Manager Rick Sago announced Tuesday night that the county was awarded a $6.3 million grant to build the center.

Richmond — one of three counties awarded — received the lion’s share of the $9.9 million available this year from the N.C. 911 Board. Graham and Hyde counties also received grants.

The funding for the grants comes from the 911 surcharge assessed on wireless phones, said Richmond County Director of Emergency Management Donna Wright.

The new center will consolidate the current Richmond County 911 center with the dispatch desks of the Richmond County Sheriff’s Office and Rockingham and Hamlet police departments.



With nearly 28 years of experience with the Lexington Division of Police and 15 months as director of Lexington Enhanced 911, I’ve seen the realities of next-generation public safety communications — what it can be and what it should be.

You can’t go 60 seconds in a conversation about public safety communications without someone using the word “interoperability.” Plus, the number of interpretations — and misinterpretations — of what it actually means is directly related to the number of participants in the conversation. That’s because “interoperability” means something different to the industry’s many facets.

One commonality, however, is that regardless of how the term is used, interoperability is vital to realizing the true potential of next-generation public safety communications and how we can better protect lives. But in order to realize that potential, everyone who has even a cursory stake in public safety operations should be aware of the breadth and impact of interoperability in each of its expressions, chief among them network and component considerations.



Thursday, 06 August 2015 00:00

The Future Economy of Continuity

The future is automation: business processes automated, IT systems automated. But where does that leave us humans in the equation of the automated world? And will there be sufficient job positions to counteract the imbalance of jobs moving to an automated cycle? The answer lies with our economy and the heads of organisations. For a business to progress and effectively make its margins and profits every year involves many factors, and the biggest factor is the expense of paying employees.



I ran into an interesting article in the Harvard Business Review this week that points to what may be a huge mistake management and IT are regularly making: Holding IT responsible for data quality. The author, Thomas C. Redman, wrote back in 2012 that you need to get responsibility for data the heck out of the IT department and put it someplace where the authority exists to assure the result. You see, line organizations collect and use the data, are far closer to the source, and have a far better understanding of what it means and how it is going to be used. This means that line organizations should own the responsibility for the data they use because they are generally closer to it, understand it more deeply, and will be the most impacted by the data quality.

Line also typically owns the budget to fund any data acquisition and analytics effort and thus is more likely to fund the effort fully. It appears that large companies and IT organizations often make the most foolish of management decisions, having the people that are responsible for something not have any real authority over it.



Thursday, 06 August 2015 00:00

Big Infrastructure, then Big Data

Big Data is turning into a big driver of enterprise infrastructure deployment, but this raises the question: since so little is known about Big Data and how it can be used, can we make any firm decisions about how to support it with existing technology?

According to a recent study by market tracking firm SteelBrick, 72 percent of high-tech providers are reporting increased sales volumes due to Big Data, and more than 40 percent report accelerating sales cycles, in some cases from more than a year to as low as three months. This means that not only is more product moving off the shelf, but also buyers are upgrading legacy systems at a faster pace. The results spanned virtually the entire enterprise data spectrum, from basic infrastructure to cloud computing and software-as-a-service. If these trends continue, expect demand to soon outstrip supply, says SteelBrick CEO Godard Abel, which inevitably leads to product shortages and rising costs.



Thinking about the day your business is destroyed by a natural disaster is about as fun as thinking about cleaning up the Christmas tree needles come Valentine's Day, when you finally decide to take the tree down. However, like life insurance, it's something important to think about, and plan for, or you could end up in a lot of trouble.

In the infographic below, we break down common disasters that can happen to a business and their potential costs, and give some great ideas on how to plan for them.

A recent article in the New Yorker magazine about the Cascadia Earthquake threat received a great amount of attention in the popular press. Multiple news organizations have profiled the story and sought to bring it home convincingly to their audiences. The question is: Will anything really change?

Six months or a year from now will the building codes be revised? Will landlords owning unreinforced masonry buildings — those that are most likely to pancake in an earthquake — be required to retrofit these buildings to address this public safety issue? Will more than a smattering of individuals or families have taken action to become more personally prepared?

As someone who has worked on earthquake and disaster planning for years, I think not, for a variety of reasons. Any reasonable person might assume that all this well-documented geological history will motivate people and organizations to change their behaviors.



I continue my exploration of the use of social media in your Foreign Corrupt Practices Act (FCPA) compliance program today. One of the ways that Chief Compliance Officers (CCOs) and compliance practitioners can communicate about their compliance programs is through the use of the social media tool Twitter. In an article in the Summer issue of the MIT Sloan Management Review, entitled “How Twitter Users Can Generate Better Ideas”, authors Salvatore Parise, Eoin Whelan and Steve Todd postulated that “New research suggests that employees with a diverse Twitter network – one that exposes them to people and ideas they don’t already know – tend to generate better ideas.” Their research led them to three interesting findings: (1) “Overall, employees who used Twitter had better ideas than those who didn’t.”; (2) In particular, there was a link between the amount of diversity in employees’ “Twitter networks and the quality of their ideas.”; and (3) Twitter users who combined idea scouting and idea connecting were the most innovative.

I do not think the first point is too controversial or even insightful, as it simply confirms that persons with greater curiosity tend to be more innovative. The logic is fairly straightforward, as the authors note: “Good ideas emerge when new information received is combined with what a person already knows.” In today’s digitally connected world, the amount of information in almost any area is significant. What the authors were able to conclude is that through the use of Twitter, “the potential for accessing a divergent set of ideas is greater.”



Wednesday, 05 August 2015 00:00

Instill an Appetite Cognizant of Risk

The time has come for firms to find a solution to financial perils, as shown by the frequency and severity of recent financial disasters.  Wouldn’t it be nice if a firm had a system that “physiologically or automatically” predisposes its stakeholders to respond coherently and in a timely manner before it is trapped in a financial hole?  This paper describes the methodology for creating such a system.

The Need for Risk Appetite

If a firm lets the feelings or emotions of its decision makers determine how much risk to take, it will likely miss the mark.  Yet rogue traders have doubled their bets on the way down, institutions have taken risks that exceeded their capital, and firms have failed due to insufficient liquidity.  Often, these debacles are caused by institutions’ lack of clearly defined risk appetites, or failure to adhere to them.



Heavy June rains, high July nutrient runoff levels likely cause for increased size


Map showing distribution of bottom-water dissolved oxygen from July 28 to August 3, west of the Mississippi River delta. Black lined areas — areas in red to deep red — have very little dissolved oxygen. (Data: Nancy Rabalais, LUMCON; R Eugene Turner, LSU. Credit: NOAA)


Scientists have found this year’s Gulf of Mexico dead zone — an area of low to no oxygen that can kill fish and marine life — is, at 6,474 square miles, above average in size and larger than forecast by NOAA in June. The larger than expected size was caused by heavy June rains throughout the Mississippi River watershed.

The measured size this year — an area about the size of Connecticut and Rhode Island combined — is larger than the 5,052 square miles measured last year, indicating that nutrients from the Mississippi River watershed are continuing to affect the nation’s coastal resources and habitats in the Gulf. The size is larger than the Gulf of Mexico / Mississippi River Watershed Nutrient Task Force (Hypoxia Task Force) target of 1,900 square miles.

“Dead zones,” also called hypoxia areas, are caused by nutrient runoff from agricultural and other human activities in the watershed and are highly affected by river discharge and nitrogen loads. These nutrients stimulate an overgrowth of algae that sinks, decomposes, and consumes the oxygen needed to support life in the Gulf. Dead zones are a major water quality issue with an estimated total of more than 550 occurring annually worldwide. The Gulf of Mexico dead zone is the second largest human-caused hypoxic area in the world.

“An average area was expected because the Mississippi River discharge levels and associated nutrient data from May indicated an average delivery of nutrients during this critical month which stimulates the fuel for the mid-summer dead zone,” said Nancy Rabalais, Ph.D., executive director of the Louisiana Universities Marine Consortium (LUMCON), who led the July 28 to Aug 3 survey cruise. A suite of NOAA-sponsored models forecasted a range of 4,633 to 5,985 square miles based on May nitrogen loading data provided by USGS. “Since the models are based largely on the May nitrogen loads from the Mississippi River, the heavy rains that came in June with additional nitrogen and even higher river discharges in July are the possible explanations for the larger size,” said Rabalais.

Funded by NOAA and the EPA, the annual measurement mapping of the dead zone provides a critical scientific record of the trend of hypoxia in the Gulf, as well as the primary measure of progress used by the Hypoxia Task Force to determine whether efforts to reduce nutrient loading upstream in the Mississippi River Basin are yielding results. This year marks the 30th annual ship-based sampling that is the backbone of the mapping effort.

“The importance of having continued and sustained coastal observations are foundational in helping us better understand the size and impacts of the Gulf dead zone. This information ultimately informs the best strategies to reduce the size and the impacts of the dead zone, which will help improve the sustainability and productivity of our coastal economy,” said Holly Bamford, Ph.D., assistant NOAA administrator for the National Ocean Service performing the duties of the assistant secretary of commerce for conservation and management.

“The annual ship-based sampling is the backbone of the mapping effort,” said Diane Altsman, chief of staff of the EPA Gulf of Mexico Program. “It is important for us to partner with NOAA on supporting the cruise this year to ensure that the Gulf of Mexico Hypoxia Task Force has the critical information needed to assess their progress in mitigating hypoxia, part of our effort to restore the Gulf coastal ecosystem.”

The largest previous Gulf of Mexico dead zone was in 2002, encompassing 8,497 square miles. The smallest recorded dead zone measured 15 square miles in 1988. The average size of the dead zone over the past five years has been about 5,500 square miles, nearly three times the 1,900 square mile goal set by the Hypoxia Task Force in 2001 and reaffirmed in 2008.

The hypoxic zone off the coast of Louisiana and Texas forms each summer, threatening the ecosystem that supports valuable commercial and recreational Gulf fisheries. NOAA-funded research in the past decade shows hypoxia results in habitat loss, displacement of fish (including shrimp and croaker) from their preferred areas, and a decline in reproductive ability in some species.

Visit the Gulf Hypoxia web site for additional graphics and information concerning this summer’s LUMCON research cruise, and previous cruises.

NOAA’s National Ocean Service has been funding monitoring and research for the dead zone in the Gulf of Mexico since 1985 and currently oversees the NGOMEX program, the hypoxia research effort for the northern Gulf which is authorized by the Harmful Algal Bloom and Hypoxia Research and Control Act.

The National Centers for Coastal Ocean Science is the coastal science office for NOAA’s National Ocean Service.

NOAA’s mission is to understand and predict changes in the Earth's environment, from the depths of the ocean to the surface of the sun, and to conserve and manage our coastal and marine resources. Join us on Facebook, Twitter, Instagram and our other social media channels.

Wednesday, 05 August 2015 00:00

The Rise of Malvertising

LAS VEGAS — One of the hottest topics in cyberthreat detection right now is the rise of malvertising, online advertising with hidden malware that is distributed through legitimate ad networks and websites. On Monday, Yahoo! acknowledged that one of these attacks had been abusing its ad network since July 28—potentially one of the biggest single attacks of its kind, given the site’s 6.9 billion monthly visits, security software firm Malwarebytes reported.

In the first half of this year the number of malvertisements jumped 260% compared to the same period in 2014, according to a new study released at the Black Hat USA conference here today by enterprise digital footprint security company RiskIQ. The sheer number of unique malvertisements has climbed 60% year over year.

“The major increase we have seen in the number of malvertisements over the past 48 months confirms that digital ads have become the preferred method for distributing malware,” said James Pleger, RiskIQ’s director of research. “There are a number of reasons for this development, including the fact that malvertisements are difficult to detect and take down since they are delivered through ad networks and are not resident on websites. They also allow attackers to exploit the powerful profiling capabilities of these networks to precisely target specific populations of users.”



The resilience challenge for the business continuity profession

Since its inception, the goal of the Business Continuity Institute has been to promote a more resilient world, and with so much attention being placed on resiliency in recent years, never has this goal been more pertinent. To help introduce the paradigm shift to resilience that we are currently experiencing, the BCI 20/20 Think Tank (UK Group) has published a new white paper that draws on recent thinking about the discipline, demonstrates the vital role business continuity plays in advancing the concept of resilience, and prepares BCI members for the next stage of progression under its umbrella.

The resilience challenge for the business continuity profession positions BC as an integral part of resilience, the benefit of which can be felt in a variety of ways throughout an organization. It directly impacts operational decision making and problem solving, allowing leaders to respond in a manner consistent with strategic intent. It also enables organizations to increase adaptive capacity, maximise competitive advantage and become more agile in the face of changes in the business environment.

The paper notes that building resilience goes beyond BC and requires substantial input from other protective disciplines. This represents a real opportunity for BCI members to advance professionally with high-level thinking and a fundamental understanding of risks which may prepare them to undertake future roles at the top level of their organizations. Thinking about resilience and aligning one’s actions to strategic goals will also strengthen the accountability of professionals in the protective disciplines as well as an organization’s top management.

Bill Crichton, Membership Director of the BCI and Chair of the 20/20 UK Group, noted: “It is recognised by many in the wider resilience community that both the individuals within it, and the professional bodies representing them, will need to grow their relevant skills and develop closer links with the other related disciplines. This first published white paper from the BCI’s UK 20/20 Group, The resilience challenge for the BC profession, is aimed at both promoting the changes needed to move the profession forward and challenging us as practitioners to develop and enhance the additional skills required to achieve a resilient future.”

The paper concludes that BCI members should realise the business environment has changed dramatically and it is necessary to adapt in order to stay relevant. The paradigm shift will likely demand a change in knowledge, skills and competencies of BCI members. In addition to technical know-how and the BC specialism, BCI members may increasingly be expected to exercise strategic communication skills as they work with other professionals. Ultimately, as the way to resilience heralds changes in business practice, it also uncovers opportunities. By accepting and responding to the changes, BCI members can profit from these opportunities.

Wednesday, 05 August 2015 00:00

BCM & DR: Decision Management

When you’re building a BCM / DR program, there are a lot of decisions to be made along the way. Some come from the results of a BIA or other information-gathering sessions, and some have to be made based on feedback from the sponsor when a potential roadblock is encountered. Regardless, decisions get made, and when they do, you – as the BCM / DR practitioner – should document them.

When documenting decisions, ensure you keep a consolidated tracking log that outlines:

  • What the decision is,
  • Who made the decision (and what meeting it was made in, if not captured in an email),
  • The date of the decision,
  • Why the decision was required (what sparked the need for a decision in the first place), and finally,
  • A unique identifier for each decision (e.g. D-001, D-002, etc.).
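For teams that prefer to keep the log in a machine-readable form, the tracking items above can be sketched as a simple record type. This is an illustrative sketch only; the field names, file name and sample entries below are assumptions, not drawn from any particular BCM tool or program:

```python
import csv
from dataclasses import dataclass, asdict, fields

# Hypothetical structure for a BCM/DR decision-log entry; the fields
# mirror the tracking items listed above.
@dataclass
class Decision:
    decision_id: str   # unique identifier, e.g. D-001
    decision: str      # what the decision is
    made_by: str       # who made it (and in what meeting, if not in email)
    date: str          # date of the decision (ISO 8601)
    rationale: str     # why the decision was required

# Sample (fictional) entries.
log = [
    Decision("D-001", "Recover payroll within 24 hours",
             "J. Smith (steering committee meeting)",
             "2015-08-03", "BIA identified payroll as time-critical"),
    Decision("D-002", "Use the secondary site as the alternate data center",
             "Sponsor (email)", "2015-08-10",
             "Primary site lacks generator capacity"),
]

# Persist the consolidated log so decisions survive staff turnover.
with open("decision_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(Decision)])
    writer.writeheader()
    writer.writerows(asdict(d) for d in log)
```

A flat CSV like this is easy to sort by identifier or date and to attach to program status reports.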



There are plenty of compelling reasons to install a surveillance system in your office, but there are also a number of reasons not to. Cameras are becoming more and more common in our daily lives, and choosing whether or not to embrace them in your own workplace can be a challenging decision.

There are advantages and disadvantages to consider before installing cameras and phone/Internet monitoring. Here are a few of them:



Wednesday, 05 August 2015 00:00

Risk Management – Looking Forward 30 Years

Last month, we looked back 30 years and reported some of the powerful lessons learned from that period with respect to risk management (with a particular focus on the last 15 years). During the last three decades, we have seen risk management evolve to a more holistic view that portrays an enterprise risk profile to help management and directors understand the full array of risks facing the organization. Access to data necessary to better understand and manage risk has never been greater. Both internal and external data sources can be combined to create more insights than ever before. While the processes used to update risk profiles certainly help executives answer the question, “Are we riskier today than we were yesterday?”, progress has been curtailed by a continued emphasis on fragmented silos, ineffective measurement and monitoring of risks, the treatment of risk as an afterthought to strategy setting, and the positioning of risk management as an appendage of performance management.

The good news is that today, largely because of the financial crisis, risk management has made its way onto the agendas of executive management and Boards of Directors as a critical discipline and a necessary part of good governance. This is a base upon which we can build as we go forward. The heightened level of importance at the highest levels of organizations will accelerate improvements in risk management in the future.



Today, FEMA’s National Integration Center (NIC) is soliciting public feedback for the update of the Federal Interagency Operational Plans (FIOPs).

This National Engagement Period began August 3, 2015 and will conclude at 5:00 pm EDT September 2, 2015. National Engagement provides an opportunity for interested parties to comment and provide feedback on the FIOPs.

Each FIOP outlines the concept of operations for integrating and synchronizing existing national-level Federal capabilities to support local, state, tribal, territorial, insular area, and Federal plans. The FIOPs are also designed to provide state, local, tribal, territorial, and insular area planners an understanding of how the Federal Government will utilize capabilities so that they may develop or modify plans accordingly. All FIOPs, except Prevention, are available to the whole community. The Prevention FIOP is Unclassified and For Official Use Only (FOUO)/Law Enforcement Sensitive (LES), Restricted Access and therefore available to appropriate personnel through separate and secure communication means.

FEMA will also host a number of National Engagement Webinars to provide stakeholders with additional details on the FIOPs update effort. The Webinars will take place August 17-26, 2015. Webinar details will be announced at a later date.

This update of the FIOPs focuses on discrete, critical content revisions, and conforming edits as a result of comments received on the National Preparedness Goal and National Planning Frameworks. Additional changes in the current draft of the FIOPs are the result of the lessons from implementing the Frameworks and recent events, as well as the findings of the National Preparedness Report.

To review the draft FIOPs, please visit http://www.fema.gov/learn-about-presidential-policy-directive-8. To provide comments, please complete the feedback form and submit it to FEMA’s NIC at [email address].

Questions can be directed to FEMA’s NIC at [email address].

For more information on national preparedness efforts, visit: http://www.fema.gov/national-preparedness.

FEMA is requesting stakeholder feedback on working drafts of four of the five Federal Interagency Operational Plans (FIOPs):  Protection, Mitigation, Response, and Recovery. The Prevention FIOP is Unclassified and For Official Use Only (FOUO)/Law Enforcement Sensitive (LES), Restricted Access and therefore available to appropriate personnel through separate and secure communication means. The FIOPs describe how the Federal government aligns resources and delivers core capabilities. Each FIOP outlines the concept of operations for integrating and synchronizing existing national-level Federal capabilities to support the whole community.

This update of the FIOPs focuses on discrete, critical content revisions and conforming edits as a result of comments received on the National Preparedness Goal and National Planning Frameworks. Additional changes in the draft are the result of lessons learned from implementing the FIOPs and recent events, as well as the findings of the National Preparedness Report. The FIOPs and feedback submission forms may be found at http://www.fema.gov/ppd-8-news-updates-announcements.

To ensure all feedback is properly handled, reviewers are asked to use the provided feedback submission form. Please send any comments and recommendations, using the submission form, to [email address] by Tuesday, September 2, 2015 at 5:00 PM EDT.

If you have any questions, please contact FEMA’s Private Sector Division at (202) 646-2600 or at [email address]. Follow FEMA online at www.fema.gov/blog, www.twitter.com/fema, www.facebook.com/fema and www.youtube.com/fema. Also, follow Administrator Craig Fugate's activities at www.twitter.com/craigatfema. The social media links provided are for reference only. FEMA does not endorse any non-government websites, companies or applications.

FEMA's mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards.

Cloud adoption and cloud-based file sharing are becoming increasingly popular among the general public, and the private use of cloud services within organizations is causing concern among CIOs. Unfortunately, IT organizations are having a hard time keeping up. According to an article from Business Cloud News, a recent survey of 100 UK CIOs conducted by Fruition Partners found that 84 percent believe cloud adoption reduces their organization’s control over IT.

However, it isn’t the cloud itself that is causing organizations to feel a lack of control. The cause of most CIOs’ anxiety is shadow IT.



The hackers responsible for the recent Anthem and U.S. Office of Personnel Management (OPM) data breaches may have attacked United Airlines as well.

And as a result, United tops this week's list of IT security news makers to watch, followed by the University of Connecticut (UConn), Franciscan St. Francis Health and the HAMMERTOSS malware.

What can managed service providers (MSPs) and their customers learn from these IT security news makers? Check out this week's list of IT security stories to watch to find out:



Technology alone is not enough in the fight against cybercrime; effective cybersecurity measures require policy and process changes as well.

That’s the takeaway from an analysis of cyber-risk spending included in the 2015 U.S. State of Cybercrime Survey recently released by PwC.

While cybersecurity budgets are on the rise, companies are mostly reliant on technology solutions to fend off digital adversaries and manage risks.

Among the 500 U.S. executives, security experts and others from public and private sectors responding to the survey, almost half (47 percent) said adding new technologies is a spending priority, higher than all other options.



For those MSPs contemplating the build-versus-buy question with regards to offering backup and disaster recovery (DR) as a service, be careful when it comes to the purchase and management of storage. Get it wrong and you could end up with a money pit.

A useful analogy is the home. Suppose a couple is looking at whether to buy a house or build their own dream house. The latter option would require buying a parcel of land, working out the plans, obtaining the necessary city permits and going to Home Depot repeatedly for an endless list of materials. With the basic elements on site, now comes the hard part: digging the trenches, cutting the steel rebar to erect the framework in which to pour the concrete, then adding the walls, doors, windows, plumbing, electrical and many more details--any one of which could trip up the homeowners and add time to the project.

Like the distraught home buyers who end up looking like they are in a remake of Tom Hanks’ “Money Pit” movie, many such projects run way over budget and are delayed by many months, if not years. Only if the homeowner has a broad do-it-yourself (DIY) skillset, or has generous contractor friends, does this method have any possibility of success.



NORTH LITTLE ROCK – Federal assistance may be available to help Arkansas communities rebuild infrastructure to higher, more disaster-resistant standards and state officials are encouraging local governments to take advantage of that funding.

The assistance to communities is part of the aid that became available following the severe storms, tornadoes, straight-line winds, and flooding during the period of May 7 to June 15, 2015.

“Generally, the federal Public Assistance program restores disaster damaged infrastructure to pre-disaster conditions,” said Nancy M. Casper, federal coordinating officer for the Federal Emergency Management Agency. “But when cost effective and technically feasible, it makes sense to rebuild to higher standards that can prevent future loss.”

FEMA’s Public Assistance program provides federal funds to reimburse a minimum of 75 percent of the costs for removing debris, conducting emergency protective measures and repairing levees, roads, bridges, public utilities, water control facilities, public buildings and parks. Mitigation funding may be considered in each project category.

Eligible applicants may include:

  • state agencies

  • local and county governments

  • private nonprofit organizations that own or operate facilities that provide essential government-type services

“Studies show that every $1 paid toward mitigation saves an average of $4 in future disaster-related costs,” said State Coordinating Officer Scott Bass of the Arkansas Department of Emergency Management (ADEM). “By adding mitigation money to repair costs, our goal is to reduce or eliminate damages from future disasters.”

As part of the process for applying for federal assistance, experts from ADEM and FEMA help identify projects that will qualify for the special mitigation program. Officials urge applicants to take advantage of the funds.

FEMA's mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards.

The SBA is the federal government’s primary source of money for the long-term rebuilding of disaster-damaged private property. SBA helps businesses of all sizes, private non-profit organizations, homeowners, and renters fund repairs or rebuilding efforts and cover the cost of replacing lost or disaster-damaged personal property. These disaster loans cover losses not fully compensated by insurance or other recoveries and do not duplicate benefits of other agencies or organizations. For more information, applicants may contact SBA’s Disaster Assistance Customer Service Center by calling (800) 659-2955, emailing [email address] or visiting SBA’s website at www.sba.gov/disaster. Deaf and hard-of-hearing individuals may call (800) 877-8339.

Is London prepared for climate change?

London’s businesses are ill-prepared for climate change risks as 54% of FTSE 100 firms have no business adaptation strategy in place for climate change. Evidence suggests that 60% of small and medium sized businesses have no plan in place to deal with extreme weather conditions.

The UK capital’s status as a global city makes its economy increasingly vulnerable to climate change, not only facing extreme weather like flooding, drought, heatwaves in the city itself, but also imported risks through the insurance sector, overseas investments and international supply chains. This is according to the new ‘Weathering the Storm’ report by the London Assembly Economy Committee which looks into the impact of climate change on London’s economy in terms of risks and opportunities.

Jenny Jones AM, the report author and former Chair of the Economy Committee, said: “Too little is being done to understand and prepare for the potential costs of climate change. London faces a great unknown when it comes to how our supply chains and economy will be hit by extreme weather events. For example, the damage from the 2011 floods in Thailand, where IT component parts are made, meant much higher prices across the global IT industry, including in London. A much worse situation would be if too many harvests failed and affected our food supply.”

It’s no secret that the field of emergency management is not overly diverse. The typical emergency manager is an older white male. This lack of diversity is rooted primarily in the profession’s evolution. Many of the first emergency managers came from police, fire or first responder backgrounds, which for a long time were largely white, male-dominated fields in most parts of the country.

“Most emergency managers traditionally came from a pretty narrow slice of the professional world,” said Joe Partridge, disaster recovery business continuity manager for CareOregon, a nonprofit involved in health plan services, reforms and innovations. “Even as recently as the late 1990s, emergency management director positions were almost always located within a police or fire department and typically staffed by either a retired or close-to-retired person from a first responder background — typically 55 years old or older and a white male.”

Carmen Merlo, director of the Portland Bureau of Emergency Management in Oregon, has been working in emergency management for 18 years. “It’s often the case that I’m the only female in the room,” she said. “I still go to conferences where literally all of the panelists are white men.”



Mike McConnell is a former director of the National Security Agency and director of national intelligence. Michael Chertoff is a former homeland security secretary and is executive chairman of the Chertoff Group, a security and risk management advisory firm with clients in the technology sector. William Lynn is a former deputy defense secretary and is chief executive of Finmeccanica North America and DRS Technologies.

More than three years ago, as former national security officials, we penned an op-ed to raise awareness among the public, the business community and Congress of the serious threat to the nation’s well-being posed by the massive theft of intellectual property, technology and business information by the Chinese government through cyberexploitation. Today, we write again to raise the level of thinking and debate about ubiquitous encryption to protect information from exploitation.

In the wake of global controversy over government surveillance, a number of U.S. technology companies have developed and are offering their users what we call ubiquitous encryption — that is, end-to-end encryption of data with only the sender and intended recipient possessing decryption keys. With this technology, the plain text of messages is inaccessible to the companies offering the products or services as well as to the government, even with lawfully authorized access for public safety or law enforcement purposes.
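The end-to-end model described above can be made concrete with a toy key-agreement sketch. The Diffie-Hellman exchange below is an illustration only, not the specific protocols these companies deploy, and it is deliberately simplified (no authentication, demo modulus), so it is not secure as written:

```python
import hashlib
import secrets

# Toy Diffie-Hellman key agreement -- an illustration only, NOT secure
# as written; real deployments use vetted groups or elliptic curves
# plus authenticated encryption.
P = 2**521 - 1   # a known Mersenne prime, used here as a demo modulus
G = 5

# Each party keeps a private exponent; only the public values cross the wire.
alice_priv = secrets.randbelow(P - 2) + 1
bob_priv = secrets.randbelow(P - 2) + 1
alice_pub = pow(G, alice_priv, P)
bob_pub = pow(G, bob_priv, P)

# Both sides derive the same shared secret; a service relaying the
# messages sees only alice_pub and bob_pub and holds no decryption key.
alice_secret = pow(bob_pub, alice_priv, P)
bob_secret = pow(alice_pub, bob_priv, P)
assert alice_secret == bob_secret

# Hash the shared secret down to a symmetric message key.
key = hashlib.sha256(str(alice_secret).encode()).digest()
```

This is the property at the heart of the policy debate: because only the endpoints ever hold the derived key, the provider in the middle has no plaintext to hand over, even under lawful process.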



Too few businesses testing their business continuity plans

Most midsize businesses have business continuity plans but few have tested them, according to The Hartford’s survey of midsize business owners and C-level executives in the US. This shortcoming presents potential risk for businesses, which may be unable to meet client needs due to an interruption in their operation or lose revenue due to a supplier issue.

The Midsize Business Monitor showed that the majority of midsize businesses surveyed (59%) had a formal, documented business continuity plan, one-third (33%) had an informal, verbal plan, and 8% reported having no plan at all. While this may be considered encouraging, what was damning was that only 19% of businesses had actually tested their plan.

The theme for Business Continuity Awareness Week 2015, run by the Business Continuity Institute was testing and exercising and one of the key themes that came out of the week was that a plan that has not been exercised is simply not a plan. You can only tell if a plan works when it is put to the test and it is far better to find out that it doesn’t work during an exercise rather than when the very existence of your business depends on it.

“Weather-related events, fires, thefts and supplier interruptions are just a few of the issues that can impact a business,” said Eric Cannon, assistant vice president of property underwriting at The Hartford. “While many midsize businesses have taken the important step of developing a formal continuity plan, testing and updating that plan on a regular basis can mean the difference between a business’s ability to recover quickly versus being unable to meet client needs.”

The Hartford survey found that more than one-third (36%) of midsize businesses had been unable to meet a client need due to an interruption in their operation, putting their relationship with that client at risk. While the majority managed to find an alternative supplier, nearly half (48%) lost business to other suppliers and 9% stated this loss was permanent.

Most midsize businesses surveyed (84%) rely on suppliers, vendors or consultants, yet four in 10 had suffered a supplier interruption and almost one-third (32%) had lost revenue due to a supplier problem.

“Even the smallest vendor or that vendor’s supplier can impact a business’s ability to meet its customers’ needs. The savvy business owner must take the time to understand the continuity plans of its suppliers and their suppliers in order to fully know who is at the table and who can step in when back-ups are needed,” said Cannon.

Is this what cyber war will look like?

Reports are saying that several major breaches, including Anthem, the U.S. government’s Office of Personnel Management (OPM) and United Airlines, which was just recently revealed, were all most likely conducted by the same Chinese cyberespionage group. All of the breaches involved the compromise of personally identifiable information (PII) of customers, employees and/or contractors, but as an eWeek article pointed out this could be a way for one government to spy or gain advantage over another government or country. Paul Kurtz, CEO of TruSTAR Technologies and a former White House cybersecurity advisor, told the publication:

We know that adversaries typically use a common command-and-control infrastructure to attack multiple companies across many sectors of the economy. Given what we've seen, it's not too shocking to learn about other breaches involving the same adversaries.



Kansas City, Mo. – The U.S. Department of Homeland Security’s Federal Emergency Management Agency (FEMA) Region VII office announced today there will be a routine biennial exercise conducted with Omaha Public Power District for the Fort Calhoun Nuclear Station in Nebraska on Aug. 4, 2015, followed by a public meeting.

Exercise participants will include: the states of Nebraska and Iowa; Washington County in Nebraska; Pottawattamie and Harrison counties in Iowa; and the Omaha Public Power District.

The routine exercise will test the abilities of the states of Nebraska and Iowa, the utility and the participating counties to protect the health and safety of the public living and working in the vicinity of the Fort Calhoun Nuclear Station.

The exercise is a biennial requirement to determine the adequacy of the state and local radiological emergency preparedness and response plans. It will require the activation of emergency facilities by the participating state and local officials. The activities of the state, county and local units of government will be observed and evaluated by the FEMA Region VII Radiological Emergency Preparedness (REP) Program. Fort Calhoun Nuclear Station on-site performance will be observed and evaluated by officials from the Nuclear Regulatory Commission (NRC).

On Thursday, August 6, 2015, a public meeting will be held to describe and explain the full-scale response exercise process. Since the process of evaluating the full-scale response exercise will take months, the preliminary findings and meeting discussion will be very limited in scope.

Members of the public and the media are invited to attend the meeting, starting at 11 a.m. (CDT) in the Fort Calhoun Volunteer Fire Station, located at 600 N. 14th Street, Fort Calhoun, Neb.

Representatives from FEMA Region VII will chair the meeting and explain the exercise process. A representative from the NRC Region IV office, located in Arlington, Texas, will discuss activities conducted on-site at the power plant during the exercise.

Follow FEMA online at www.twitter.com/fema, www.facebook.com/fema, and www.youtube.com/fema.  Find regional updates from FEMA Region VII at www.twitter.com/femaregion7. Also, follow Administrator Craig Fugate's activities at www.twitter.com/craigatfema.  The social media links provided are for reference only. FEMA does not endorse any non-government websites, companies or applications.

FEMA’s mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards.

OKLAHOMA CITY – To date, Oklahomans have received more than $40.7 million in grants, low-interest loans and insurance settlements from the federal government, helping families rebuild their lives and helping businesses recover from the severe weather and subsequent flooding during the period of May 5 through June 22.

Nearly 10,000 families have registered for assistance with the Oklahoma Department of Emergency Management, the Federal Emergency Management Agency and the U.S. Small Business Administration (SBA).

The disaster assistance, which totals more than $40.7 million, includes more than $15.5 million approved for homeowners and renters, more than $13.2 million in grants for housing, including home repairs and rental assistance, and more than $2.1 million for Other Needs, such as repair or replacement of personal property essential to the home. It also includes more than $8.6 million in payments to survivors through the National Flood Insurance Program and more than $16.7 million in SBA loans.

SBA has issued 1,342 applications for low-interest disaster loans to homeowners and businesses. More than $15.5 million has been approved for homeowners, and more than $1.2 million in loans has been approved for business owners rebuilding after the storms.

Low-interest SBA disaster loans may be available to businesses of all sizes as well as certain private nonprofit organizations. Homeowners and renters are also eligible for SBA loans for uninsured loss. These loans cannot duplicate benefits from other agencies or compensation from other organizations.

FEMA deployed 88 Disaster Survivor Assistance specialists, who have gone door to door in the 45 affected counties. To date, they have visited 18,878 homes and 889 community-based organizations, delivering recovery information and guidance. These specialists have also registered 647 survivors for disaster assistance. A total of 4,206 people have visited Disaster Recovery Centers (DRCs).

Survivors may apply for state and federal assistance online with any computer, smartphone, or tablet at www.DisasterAssistance.gov or by calling 800-621-3362 or (TTY) 800-462-7585. Those who use 711-Relay or Video Relay Services can call 800-621-3362 to register. Hours to register by phone: 6 a.m. to 9 p.m. local time, seven days a week.

For more information on Oklahoma disaster recovery, click http://www.fema.gov/disaster/4222 or visit OEM at www.oem.ok.gov.

WASHINGTON – Today, the U.S. Department of Homeland Security's Federal Emergency Management Agency (FEMA) and Portlight Strategies (Portlight) announced an agreement that will increase preparedness awareness for people with disabilities in the event of natural or man-made disasters. The agreement aligns with FEMA’s commitment to inclusive emergency management by partnering with disability organizations and community leaders who serve the whole community at the local level.

“As we celebrate the 25th anniversary of the Americans with Disabilities Act, we are also reinforcing our commitment to serving the whole community before, during and after disasters,” said Craig Fugate, FEMA Administrator. “By having preparedness plans and thinking ahead, individuals, families and communities will be ready to respond to these events when they occur.”

The new partnership will bolster working relationships with state, local, tribal and territorial emergency managers to encourage including people with disabilities in planning.  It will also provide information so people understand the disaster risks in their area. By evaluating their own individual needs and making an emergency plan that fits those needs, people can be better prepared.

Some key highlights from the agreement show that FEMA and Portlight will:

  • Participate in training events, simulation exercises, drills, and discussions focused on emergency preparedness and lessening the impact of disasters;
  • Share operational practices that work well and that may be adapted to make improvements in service delivery and support community resilience and accessibility for people with disabilities and others with access and functional needs; and
  • Share research-based emergency management data and information and training experience and expertise before, during, and after disasters.

"We're excited about this next important step in our relationship with FEMA and the ways it will enhance our ability to serve the disability community in times of disaster,” said Paul Timmons Jr., Portlight Co-founder and Board Chair. “It embodies our philosophy that there must be nothing about us without us.”

The primary mission of Portlight Strategies, Inc. (Portlight) is to provide disaster relief and recovery services specifically for people with disabilities and to facilitate accessible services—compliant with the Americans with Disabilities Act of 1990 (ADA)—from all providers, whether governmental or non-governmental.


FEMA's mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain and improve our capability to prepare for, protect against, respond to, recover from and mitigate all hazards.

Follow FEMA online at www.fema.gov/blog, www.twitter.com/fema, www.facebook.com/fema and www.youtube.com/fema.  Also, follow Administrator Craig Fugate's activities at www.twitter.com/craigatfema.

The social media links provided are for reference only. FEMA does not endorse any non-government websites, companies or applications.

Depending on who you talk or listen to, hyper-converged storage is either the future of storage or a hyped niche product that is not for everybody, particularly not for larger environments.

Admittedly, there is a lot of hype in and around convergence, including hyper-convergence. On the other hand, there is also a lot of reality in various converged infrastructure (CI), hyper-converged infrastructure (HCI), cluster in a box (CiB) and other solution bundle approaches.

Not every data center is the same; your data center will be different depending on whether you are a small office home office (SOHO), remote office branch office (ROBO) with a few servers, a departmental workgroup, small medium business (SMB), small medium enterprise (SME), large enterprise, web-scale or cloud services provider.



(TNS) - Nearly three years after Hurricane Sandy devastated New Jersey, its effects linger in the form of heightened anxiety and post-traumatic stress disorder, a report released Wednesday found.

More attention should be paid to the emotional consequences of housing damage, including mold, the report stated. Surprisingly, children who lived in homes with minor damage were even more likely than those in homes with major damage to feel sad or depressed or have trouble sleeping.

"We're definitely still hearing about the issues and the problems," said David Abramson, a New York University researcher who led the Sandy Child and Family Health Study.



The cloud environment you know today will be very different from the cloud environment you’ll see in a couple of years – just as it’s different from the one you saw a couple of years ago. As the cloud evolves, cloud security compliance protocols will evolve, too. As a managed service provider (MSP), it’s important to always be mindful of the latest codes of compliance for cloud data storage and cloud-based file sharing across any and all industries.

As lawmakers and governing bodies continue to gain an understanding of the impact that cloud computing has on the modern business community, the rules being put in place will become more stringent. They’ll also be revised and amended in an attempt to evolve with the cloud space.

The list of compliance regulations already in place includes PCI DSS (The Payment Card Industry Data Security Standard), SOX (The Sarbanes-Oxley Act of 2002), GLBA (The Gramm-Leach-Bliley Act), and HIPAA (The Health Insurance Portability and Accountability Act of 1996) – and that’s just to name a few. As noted by Paul Korzeniowski for CIO.com, this list will only grow longer.



After West Africa's 2014 Ebola epidemic magnified awareness about the deadly virus' effects -- and local response tactics -- Onslow County Health Department revisited its methods for prevention and management of communicable diseases.

"Ebola, it really took America by storm," said Pamela Brown, health department spokeswoman. "It really captured the public's imagination. It also gave us the opportunity to highlight the importance of public health. We are constantly preparing with our partners for just such a thing."

Ebola is a rare viral hemorrhagic fever that can spur severe headaches, fatigue, muscle pain, vomiting, diarrhea, abdominal pain or "unexplained hemorrhage," according to the Centers for Disease Control and Prevention (CDC). The 2014 Ebola epidemic is the largest in history; per the CDC, two imported cases, including one death, and two locally acquired cases in health care workers were reported in the United States.



(TNS) - When a 7.8-magnitude earthquake hit Baguio on July 16, 1990, 5-year-old Klaridelle Reyes was sleeping on a couch. She woke up to a cacophony of voices and loud footsteps. She could hear people shouting, running to safety.

Kyle Yan, a 16-year-old student at Saint Louis University, was also napping on that cold afternoon when the quake struck. He awoke in the commotion and then waded through piles of books and personal belongings that had fallen to the floor during the first few seconds of the quake.

Outside, buildings were starting to crumble, landslides blocked roads and mines collapsed on hapless workers.



When the Rana Plaza building collapsed in Bangladesh, it wasn’t the physical disruption to the supply chain that caused the most damage to organizations at the top, it was the reputational damage as a result of the poor safety standards and human rights abuses taking place further down the chain. A disruption in one organization, whether physical or reputational, will have an impact throughout the entire supply chain.

Have organizations learnt their lesson from the incident above? The risk of organizations breaching international human rights regulations has risen significantly over the last quarter as key Asian economies adapt to tougher economic conditions. That is the conclusion of the latest Risk Index Report from BSI, which identifies China, India, Vietnam, Bangladesh and Myanmar as the five highest-risk countries for human rights violations. These countries account for 48% of global apparel production, 53% of global apparel exports, and 26% of global electronics exports.

The Quarterly BSI Risk Index is based on intelligence from BSI’s Supply Chain Risk Exposure Evaluation Network (SCREEN) tool, which provides real-time incident reports on corporate social responsibility (CSR), security, and business continuity risks and threats across more than 20 proprietary risk categories in 200 countries. Supply Chain Intelligence from SCREEN identifies major CSR concerns, such as brand protection risks and changes to global regulation, including the US legislation aimed at eliminating forced child labour, the EU draft conflict minerals law, and the UK’s Modern Slavery Act, all of which relate directly to complex supply chains worldwide and can expose an organization to prosecution if its suppliers exploit human rights.

In addition to the legal repercussions, an organization’s brand reputation and consumer trust is compromised. The latest generation of consumers, millennials, are focused on buying from ethical and responsible businesses, highlighting the increased importance for organizations to adopt a supply chain risk management program and implement risk-based sourcing strategies. Understanding country-level threats provides the needed intelligence to filter risk to underpin a socially compliant and responsible supply chain.

The latest BSI Risk Index report warns that efforts by Asian governments to boost their economies are having the unintended consequence of allowing child labour abuses to become more prevalent in supply chains. Also highlighted were proposed changes to labour laws that may incentivise firms to restructure as 'family enterprises', making it easier to employ underage workers in a country where 4.4 million children are already put to work.

Mike Bailey, EMEA Director of Professional Services at BSI, commented: “Organizations can no longer turn a blind eye to the actions of their suppliers. The laws we are seeing today may only apply to larger firms, but they set a benchmark for the industry and smaller organizations will be forced to comply to work with the larger companies, by default. Products assembled or services provided by child labour or depending on minerals from conflict zones have no place in the modern world.” 

Thursday, 30 July 2015 00:00

BCI: The cost of catastrophe

Less than a third (31%) of global economic losses as a result of natural disasters were covered by insurance (including both private insurers and government-sponsored programs) during the first half of 2015, according to a new study by Aon Benfield. This is slightly above the 10-year average of 27% because the majority of the losses occurred in regions with higher insurance penetration.

By contrast, around 2% of the multi-billion-dollar economic loss from the Nepal earthquake was covered by insurance. Statistics like this show how catastrophe models can play a role in helping the insurance industry to better understand these risks and seek ways to grow insurance penetration in underserved regions.

On a more positive note, losses during the first half of 2015, from both an economic and insured loss perspective, were each below the 10-year (2005-2014) average. Preliminary data from the Global Catastrophe Recap: First Half of 2015 report determined that economic losses were US$46 billion, down 58% from the 10-year average of US$107 billion, and insured losses were US$15 billion, down 47% from the 10-year average of US$28 billion.
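The relationship between these preliminary loss figures and the coverage percentages quoted earlier can be checked with a few lines of arithmetic. This is only an illustrative sketch using the numbers quoted above; small differences from the reported 31% and 27% reflect rounding in the preliminary data.

```python
# Insured share of economic losses, from the Aon Benfield figures quoted
# above (all values in US$ billions; preliminary data).
economic_loss_h1_2015 = 46.0   # H1 2015 economic losses
insured_loss_h1_2015 = 15.0    # H1 2015 insured losses
ten_year_avg_economic = 107.0  # 2005-2014 average economic losses
ten_year_avg_insured = 28.0    # 2005-2014 average insured losses

coverage_2015 = insured_loss_h1_2015 / economic_loss_h1_2015
coverage_10yr = ten_year_avg_insured / ten_year_avg_economic

print(f"H1 2015 insured share: {coverage_2015:.0%}")   # close to the ~31% reported
print(f"10-year average share: {coverage_10yr:.0%}")   # close to the ~27% average
```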

The severe thunderstorm peril was the costliest disaster type, comprising 33% of the economic loss and 49% of the insured loss. Most of the costs were attributed to strong convective thunderstorm events that prompted widespread hail, damaging straight-line winds, tornadoes, and major flash flooding in the United States during the months of April, May and June.

A clear majority (73%) of the insured losses were sustained in the United States due to an active winter season combined with numerous spring severe convective storm events. Asia Pacific was second with 14% and Europe, Middle East & Africa was third with 11% of the insured loss.

Steve Bowen, associate director and meteorologist with Aon Benfield's Impact Forecasting team, said: "The first half of 2015 was the quietest on an economic and insured loss basis since 2006. Despite having some well-documented disaster events in the United States, Asia Pacific and Europe, it was a largely manageable initial six months of the year for governments and the insurance industry. Looking ahead to the rest of 2015, the continued strengthening of what could be the strongest El Nino in nearly two decades is poised to have far-reaching impacts around the globe. How that translates to disaster losses remains to be seen, but something to keep a close eye on in the coming months."

Thursday, 30 July 2015 00:00

Early Warning On Heat Health Risk

As many parts of the United States enter another day of high heat and humidity, we’re reading about the first ever heatwave warning guidelines issued by the United Nations earlier this month.

The guidelines are intended to alert the general public, health services and government agencies via the development of so-called heatwave early warning systems that should ultimately lead to actions that reduce the effects of hot weather extremes on health.

As the foreword to the publication states:

“Heatwaves are a dangerous natural hazard, and one that requires increased attention. They lack the spectacular and sudden violence of other hazards, such as tropical cyclones or flash floods, but the consequences can be severe.”



WASHINGTON – August 2015 marks the tenth year since the devastating 2005 Atlantic Hurricane Season.  According to the National Oceanic and Atmospheric Administration (NOAA), Hurricane Katrina was one of the strongest storms to impact the coast of the United States, causing widespread devastation and affecting an estimated 90,000 square miles along the central Gulf Coast states. Less than a month later Hurricane Rita, and then Hurricane Wilma in October, made landfall, compounding an already catastrophic situation.

Ten years into the recovery, FEMA continues to support communities and families, working side-by-side with state, local, and tribal partners to finish the job of rebuilding communities that are the economic engines and lifeblood of the Gulf Coast. To date, FEMA has provided $6.7 billion to more than one million individuals and households.  FEMA provided more than $131 billion to the states of Louisiana, Mississippi, Alabama, and Florida for public works projects in the aftermath of Hurricane Katrina to assist with recovery efforts.  

“Today, FEMA has the authority necessary to lean forward and leverage the entire emergency management team in response and recovery efforts,” said FEMA Administrator Craig Fugate.  “This team includes not only government but also the private sector, non-profits, and citizens themselves.  We support survivors and this holistic approach emphasizes the importance of working as a team to prevent, protect against, respond to, recover from, and mitigate all hazards.”

Since 2005, FEMA has significantly improved its ability to assist communities in responding to and recovering from disasters. With the support of Congress, FEMA was provided additional authorities and tools to become a more effective and efficient agency, one that is focused on putting survivors first.  Specifically, the Post-Katrina Emergency Management Reform Act (PKEMRA) of 2006 gave FEMA clear guidance on its mission and priorities, and provided the legislative authorities needed to better partner with state, local, tribal, and territorial governments before, during, and after disasters.  These improvements include:

  • Improved ability to provide support to states and tribes ahead of a disaster. Since 2005, FEMA gained statutory authority to surge resources to states, tribes, and territories ahead of a disaster should the capacity of states, tribes or territories become overwhelmed.  This authority expedites FEMA’s ability to respond to disasters if and when a state, tribe or territory requests support and a disaster is declared by the President. 
  • Development of a National Disaster Recovery Framework (NDRF). PKEMRA required FEMA, along with its partners, to develop a national disaster recovery strategy to guide recovery efforts after major disasters and emergencies. The NDRF clearly defines coordination structures, leadership roles and responsibilities, and guidance for federal agencies, state, local, territorial, and tribal governments, and other partners involved in disaster planning and recovery.
  • Establishment of Incident Management Assistance Teams.  These full time, rapid response teams are able to deploy within two hours and arrive at an incident within 12 hours to support the local incident commander. The teams support the initial establishment of a unified command and provide situational awareness for federal and state decision makers crucial to determining the level and type of immediate federal support that may be required.
  • Improved Search and Rescue capability.  Since 2005, FEMA has better integrated search and rescue assets from across diverse Federal agencies such as the U.S. Coast Guard and the Department of the Interior. 
  • Establishment of Regional Emergency Communications Coordination Working Groups (RECCWGs) to serve as the primary focal points for interoperable communications coordination among federal, state, local, tribal and territorial emergency responders. The statute charges these RECCWGs with coordinating effective multi-jurisdictional and multi-agency emergency communications networks for use during disasters and emergencies.
  • Enhanced partnerships with the private sector. As part of this effort, FEMA established the National Business Emergency Operations Center that serves as a clearinghouse for two-way information sharing between public and private sector stakeholders in preparing for, responding to, recovering from, and mitigating disasters.
  • Support for the inclusion of people with access and functional needs. The Office of Disability Integration and Coordination was established to provide technical assistance and guidance for a wide range of emergency management activities, including equal access to emergency programs and services and meeting the access and functional needs of the whole community. This includes: preparedness, exercises, emergency alerting, accessible transportation and shelter accessibility guidance, assistive technology devices for accessible communication, accessible housing and grant guidance to states for accessibility, and partnership and stakeholder outreach.

For more information on FEMA’s continued work to support communities and families along the Gulf Coast, visit our Hurricane Katrina: A Decade of Progress through Partnerships website.


FEMA's mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain and improve our capability to prepare for, protect against, respond to, recover from and mitigate all hazards.

Follow FEMA online at www.fema.gov/blog, www.twitter.com/fema, www.facebook.com/fema and www.youtube.com/fema.  Also, follow Administrator Craig Fugate's activities at www.twitter.com/craigatfema.

The social media links provided are for reference only. FEMA does not endorse any non-government websites, companies or applications.

Few words spark more angst in business circles than “controls.” No one wants to be controlled, yet controls are an integral part of any business. Unfortunately, many people equate the word with a costly compliance exercise, largely thanks to the Sarbanes-Oxley Act of 2002 (SOX). This is not an article on SOX, but rather a look at how and why controls should be understood and appreciated by all organizations, regardless of type, industry or size. Defining and assessing controls is simply a sound business exercise regardless of regulatory compliance considerations.

However, before we leave SOX, there is a common question I want to address. Private companies and nonprofit organizations often ask whether SOX makes sense for them. First, let’s put this in perspective. SOX contains 66 sections within 11 titles covering a wide range of governance, audit, business, regulatory and enforcement topics. By far the most frequently cited section is Section 404, entitled Management Assessment of Internal Controls, so for simplicity I will approach the question from this single section. Section 404 requires an annual management assessment of the effectiveness of Internal Controls over Financial Reporting (ICFR), as well as an external audit opinion on ICFR for public companies reaching certain size thresholds. The answer is a definite “yes” regarding periodic management assessments, as these are simply a prudent business practice. As for the additional attestation work, it is likely not warranted for most organizations. Instead, companies should ask their auditor to point out areas for control improvement as the auditor obtains an understanding of ICFR while planning the audit of the financial statements. This independent feedback can be a valuable piece of the audit value proposition.



It seems that the prevailing wisdom in data center circles these days is that Big Data will simply be too big for the enterprise. When faced with the enormous volumes of sensor-driven and machine-to-machine (M2M) feedback, the enterprise will have no choice but to push the vast majority of the workload onto the cloud.

To be sure, the cloud offers a compelling value proposition when it comes to Big Data, but that does not mean that even small organizations won’t be able to build their own analytics infrastructure for the most crucial data.

The mistake that many executives make when contemplating Big Data is applying those volumes to infrastructure as it exists today. In reality, the infrastructure of tomorrow will be more compact, more scalable and more attuned to these emerging workloads than the legacy systems currently occupying the data center.



By Tyler M. Sharp, Ph.D. (LCDR, USPHS)

Most travelers to Africa know to protect themselves from malaria. But malaria is far from the only mosquito-borne disease in Africa. Recent studies have revealed that dengue, a disease that is well recognized in Asia and the Americas, may be commonly misdiagnosed as malaria in Africa. So if you’re traveling to Africa, in addition to taking anti-malarial medications you should also take steps to avoid dengue.

Map of areas around the world affected by Dengue.

Dengue is a mosquito-transmitted illness that is recognized as a common illness throughout Southeast Asia and much of the Americas. In fact, a study published in 2013 estimated that 390 million dengue virus infections occurred throughout the tropics in 2010. Although 70% of infections were predicted to have occurred in Southeast or Southcentral Asia, the next most affected region (16% of infections) was Africa, followed by the Americas (14% of infections). The large estimated burden of dengue in Africa came as a surprise to some, since dengue is not often recognized to be a risk in Africa.
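To put those regional shares in perspective, a quick back-of-the-envelope calculation (a sketch using only the figures quoted above) converts the 2013 study's percentages into approximate infection counts:

```python
# Convert the 2013 study's regional shares of the estimated 390 million
# dengue virus infections (2010) into approximate counts per region.
total_infections_millions = 390  # estimated infections worldwide, 2010

regional_share = {
    "Southeast/Southcentral Asia": 0.70,
    "Africa": 0.16,
    "Americas": 0.14,
}

for region, share in regional_share.items():
    # e.g. Africa: 0.16 * 390 is roughly 62 million infections per year
    print(f"{region}: ~{share * total_infections_millions:.0f} million infections")
```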

Dengue is Hard to Diagnose in Africa

There are several reasons why dengue has limited recognition in Africa. First, the lack of laboratory-based diagnostic testing leads to many patients not being diagnosed with dengue. This can be perilous because without early diagnosis and appropriate clinical management, dengue patients are at increased risk for poor outcome. However, in order for a clinician to request dengue testing, they must first be aware of the risk for dengue. This awareness usually comes in the form of a positive diagnostic test result. Hence, without testing there is limited clinical awareness, and without clinical awareness there is limited testing.

Finding Dengue in Africa

Map of Africa

Brown indicates countries in which dengue has been reported in residents or returned travelers and where Aedes aegypti mosquitoes are present. Light brown indicates countries where only Ae. aegypti mosquitoes have been detected.

How do we know that there actually is dengue in Africa? First, since 1960 at least 15 countries in Africa have reported locally-acquired dengue cases. In addition, travelers returning home with dengue have been detected after visiting more than 30 African countries. Still more African countries are known to have the Aedes mosquitoes that transmit the four dengue viruses. Together, these findings provide strong evidence that dengue is a risk in much of Africa.

Thus, it was not a surprise in the summer of 2013 when dengue outbreaks were detected in several sub-Saharan African countries. In many cases, detection of dengue was facilitated by the availability of rapid dengue diagnostic tests that enabled on-site testing.

Dengue Field Investigations in Angola and Kenya

In a past blog I described the initial findings of a dengue outbreak in Luanda, Angola, in west-central Africa: dengue cases were initially identified with a rapid diagnostic test, and confirmatory diagnostic testing and molecular epidemiologic analysis performed at CDC demonstrated that the virus had actually been circulating in the region for at least 45 years. This provided strong evidence that dengue was endemic in the area. During the outbreak investigation, CDC and the Angola Ministry of Health conducted house-to-house surveys in which blood specimens and questionnaires were collected. Of more than 400 participants, 10% had been recently infected.

Teams from the Angola Ministry of Health conduct a dengue serosurvey in Luanda, Angola. Image courtesy of the Angola Field Epidemiology Training Program.


Though nearly one-third reported a recent dengue-like illness, and half had sought medical care, none of the participants with laboratory evidence of dengue virus infection had been diagnosed with dengue, including one person who had symptoms consistent with severe dengue. Although this investigation yielded more questions than answers, it was clear that there was much more dengue in Luanda than was being recognized clinically. By improving clinical awareness through training of clinicians and strengthening disease surveillance, the ability to diagnose individuals ill with dengue or other emerging infectious diseases was improved.

On the opposite coast of Africa in Mombasa, Kenya, although dengue outbreaks had been reported for decades, the first outbreak to be confirmed with laboratory diagnostics occurred in the early 1980s. When an outbreak of non-malarial illness was reported in 2013, blood specimens were sent to a laboratory at the Kenya Medical Research Institute (KEMRI) to determine the cause of the outbreak. Three of the four dengue viruses were detected during this outbreak, which alone suggested that dengue was endemic in the area. To get a better idea of how much dengue there was in Mombasa, CDC and the Kenya Ministry of Health conducted a representative survey in a populous neighborhood of Mombasa. Over 9 days, 1,500 people were enrolled in the serosurvey, and testing revealed that 13% of participants were currently or recently infected with a dengue virus. Nearly half of infected individuals reported a recent dengue-like illness, and most of those had sought medical care.

Field workers from CDC and the Kenya Ministry of Health conduct a dengue serosurvey in Mombasa, Kenya. Image courtesy of Dr. Esther Ellis.


However, nearly all of these patients had been diagnosed with malaria. Because Mombasa is a port city that is also a popular tourist destination, the apparent magnitude of the outbreak was a concern not only for patient diagnosis and care in Mombasa; it also meant that visitors to Mombasa may not be aware of the risk of dengue and therefore could be getting sick and/or bringing the virus home with them.

What next?

There is not yet a vaccine to prevent infection or a medication to treat dengue. Unlike the night-time-biting mosquitoes that transmit malaria, the Aedes mosquitoes that spread dengue are day-time biters. Consequently, both residents of and travelers to Africa should protect themselves from mosquito bites by using mosquito repellent. Other strategies, like staying in places with air conditioning and screens on windows and doors and wearing long-sleeved shirts and pants, can also help, whether you're traveling to Africa or other regions of the tropics. For clinicians: if a traveler recently returned from Africa presents with acute febrile illness, consider dengue as a potential cause of the patient's illness.

We still have much to learn about dengue in Africa, but learning where there is risk of dengue is the first step to avoiding it.


Wednesday, 29 July 2015 00:00

Brain Design-Inspired Computing Is Here

Computing inspired by the design of brains is rapidly progressing. Very rapidly.

Companies like IBM and Qualcomm are financing neurochip projects, and in the case of IBM’s Cognitive Computing push, it may be betting its own future on neuromorphic technology. Europe is investing US $1.3 billion in the Human Brain Project, which sets out to simulate the human brain. Not to be left behind, the US announced in 2013 that it is investing $300 million in its own Brain Initiative with similar objectives. Researchers in the UK, in Canada, at Stanford University, and at DARPA are all working on various aspects of the neuromorphic computing puzzle, and are now publishing their results.

Deep thinkers like Stephen Hawking and tech billionaires like Bill Gates and Elon Musk ominously warn about the impending perils of this technology, while proponents (including Paul Allen, also of Microsoft fame) fight back. Many world scientists are dismayed over how the Human Brain Project is unfolding, fearing the project is quixotic and not transparent, and they are now raising a ruckus. Philosophers continue to rail against the whole matter of intelligent machines, but this time not so safely detached, since with recent technical advances the future is a lot closer now than it was in the last artificial intelligence (AI) go-around more than 25 years ago.



WASHINGTON — As part of the U.S. Department of Homeland Security’s (DHS) ongoing efforts to support state, local, tribal, and territorial partners, Secretary Jeh Johnson today announced final allocations for eight Fiscal Year 2015 DHS preparedness grant programs, including the Homeland Security Grant Program. These allocations total more than $1.6 billion to assist states, urban areas, tribal and territorial governments, non-profit agencies, and the private sector with their preparedness efforts.

Together with previous grant funding awarded since 2002, DHS has awarded over $40 billion to these partners. Preparedness grants strengthen our nation’s ability to prevent, protect against, mitigate, respond to, and recover from terrorist attacks, major disasters, and other emergencies in support of the National Preparedness Goal and the National Preparedness System.

The FY 2015 grants focus on the nation’s highest risk areas, including urban areas that continue to face the most significant threats. Consistent with previous grant guidance, dedicated funding is provided for law enforcement and terrorism prevention activities throughout the country to prepare for, prevent, and respond to crimes and other precursors or indicators of terrorist activity.

Preparedness Grant Program Allocations for Fiscal Year 2015:

Homeland Security Grant Program (HSGP)—provides more than $1 billion for states and urban areas to prevent, protect against, mitigate, respond to, and recover from acts of terrorism and other threats. 

  • State Homeland Security Program (SHSP)—provides $402 million to support the implementation of the National Preparedness System to build and strengthen preparedness capabilities at all levels.
  • Urban Areas Security Initiative (UASI)—provides $587 million to enhance regional preparedness and capabilities in 28 high-threat, high-density areas.
  • Operation Stonegarden (OPSG)—provides $55 million to enhance cooperation and coordination among local, tribal, territorial, state, and Federal law enforcement agencies to jointly enhance security along the United States land and water borders where there are ongoing Customs and Border Protection missions.

Awards made to the states and urban areas for HSGP carry pass-through requirements.  Pass through is defined as an obligation on the part of the State Administrative Agency (SAA) to make funds available to local units of government, combinations of local units, tribal governments, or other specific groups or organizations.  The SAA must obligate at least 80 percent of the funds awarded under SHSP and UASI to local or Tribal units of government.  

Per the Homeland Security Act of 2002, as amended, DHS/FEMA is required to ensure that at least 25 percent of grant funding appropriated for HSGP and the Tribal Homeland Security Grant Program are used for law enforcement terrorism prevention activities (LETPA).  DHS/FEMA ensures that this requirement is met in part, by requiring all SHSP and UASI recipients to ensure that at least 25 percent of the combined HSGP funds allocated under SHSP and UASI are dedicated towards LETPA. This 25 percent can be from SHSP, UASI, or both.  The 25 percent LETPA allocation is in addition to the 80 percent pass-through requirement to local units of government and Tribes.
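The interaction of the two HSGP requirements described above can be illustrated with the FY 2015 figures from this release. This is only an arithmetic sketch of the stated minimums; actual allocations are governed by the grant guidance, not this calculation.

```python
# Minimum HSGP allocation requirements, using the FY 2015 figures
# quoted in this release (amounts in US$ millions).
shsp = 402.0   # State Homeland Security Program
uasi = 587.0   # Urban Areas Security Initiative

combined = shsp + uasi

# At least 25% of combined SHSP + UASI funds must be dedicated to law
# enforcement terrorism prevention activities (LETPA)...
letpa_minimum = 0.25 * combined

# ...and separately, the SAA must pass through at least 80% of SHSP and
# UASI awards to local or tribal units of government.
pass_through_minimum = 0.80 * combined

print(f"Combined SHSP + UASI:       ${combined:,.2f}M")
print(f"LETPA minimum (25%):        ${letpa_minimum:,.2f}M")
print(f"Pass-through minimum (80%): ${pass_through_minimum:,.2f}M")
```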

Emergency Management Performance Grant (EMPG) Program—provides over $350 million to assist local, tribal, territorial, and state governments in enhancing and sustaining all-hazards emergency management capabilities. 

Tribal Homeland Security Grant Program (THSGP)—provides $10 million to eligible tribal nations to implement preparedness initiatives to help strengthen the nation against risk associated with potential terrorist attacks and other hazards.

Nonprofit Security Grant Program (NSGP)—provides $13 million to support target hardening and other physical security enhancements for nonprofit organizations that are at high risk of a terrorist attack and located within one of the 28 FY 2015 UASI-eligible urban areas.

Intercity Passenger Rail - Amtrak (IPR) Program—provides $10 million to protect critical surface transportation infrastructure and the traveling public from acts of terrorism and increase the resilience of the Amtrak rail system.

Port Security Grant Program (PSGP)—provides $100 million to help protect critical port infrastructure from terrorism, enhance maritime domain awareness, improve port-wide maritime security risk management, and maintain or reestablish maritime security mitigation protocols that support port recovery and resiliency capabilities.

Transit Security Grant Program (TSGP)—provides $87 million to owners and operators of transit systems to protect critical surface transportation and the traveling public from acts of terrorism and to increase the resilience of transit infrastructure.

Intercity Bus Security Grant Program (IBSGP)—provides $3 million to assist operators of fixed-route intercity and charter bus services within high-threat urban areas to protect bus systems and the traveling public from acts of terrorism, major disasters and other emergencies.

Further information on DHS’s preparedness grant programs is available at www.dhs.gov and http://www.fema.gov/grants.


FEMA's mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain and improve our capability to prepare for, protect against, respond to, recover from and mitigate all hazards.

Follow FEMA online at www.fema.gov/blog, www.twitter.com/fema, www.facebook.com/fema and www.youtube.com/fema.  Also, follow Administrator Craig Fugate's activities at www.twitter.com/craigatfema.

The social media links provided are for reference only. FEMA does not endorse any non-government websites, companies or applications.

By and large, organizations tend to invest in preventative cybersecurity measures and concentrate their resources on detecting and stopping cyberattacks rather than on painstaking “who did it?” investigations. They want to close the gap, manage the public opinion fallout, learn from the episode and move on.

From an enterprise perspective, this makes sense: cybersecurity resources are usually overstretched, and the organization does not stand to gain much from determining with any certainty who was behind a cyberattack. The incentive equation, of course, is different if the target of the attack is a government or a large organization that is part of a country’s critical national infrastructure.

Attack attribution has traditionally been approached from the perspective of enabling the target or victim entity to pursue the attacker for damages in a court of law, or from a national, military or intelligence “strike back” perspective.



Business And IT 

In today’s world, company operations function at two distinct levels: the business operation level and the IT infrastructure operation level. While the two functions operate independently, IT exists to support the business. Many IT operations, such as the deployment and management of IT infrastructure, applications and services, are driven by business-layer requirements in a top-down fashion to enable the company to carry out its business. IT infrastructure management, including addressing cybersecurity risks, is done exclusively in the IT layer. There are several tools, such as FireEye, McAfee, Qualys, ArcSight and BMC Software, which IT deploys and uses to identify and manage IT security risk, but something is missing.

A chasm exists between the IT layer and business layer, when looked at from a bottom-up perspective.



Every once in a while it’s good to take stock of a situation. A projected 1.25 billion Android users for 2015 (according to Gartner) is such a situation. Either your organisation is already an Android shop or it is likely to become one in the near future. A plethora of software apps for the Android OS and a decidedly spotty security record for many Android users means that reviewing your approach to Android security could be a wise move as well.

While advanced technology exists to help protect Android systems, a reminder about security basics can go a long way to avoiding problems:



That the cloud is a major boon to the enterprise is beyond question. At this point, it’s kind of like saying the CPU was a really good idea.

But no matter how valuable the cloud becomes, there will always be questions over its design, implementation and efficacy when it comes to specific applications and workloads. Already, cloud architectures have diverged along three distinct tracks – SaaS, PaaS and IaaS – and countless sub-tracks that use one or more of the three to achieve targeted goals, such as data center as a service, disaster recovery as a service and networking as a service.

So while increased use of enterprise-facing cloud services seems inevitable, the hows, wheres and whys of this transition are still unclear, which is why leading IT vendors like Intel are hoping to move things along.



Anyone who has been around the managed services market a while knows this: Companies can get pretty creative with their definitions of “managed services.”

While inventive definitions of the term may deliver food for thought or some level of entertainment, it’s hard to get customers to understand what a service delivers if providers can’t agree on its meaning. It’s no wonder, then, that even customers who hire an MSP don’t always know what “managed services” means.



The reinsurance industry has recently seen a rise in mergers and acquisitions among some of its biggest players, such as Axis Capital Holdings Ltd. and PartnerRe Ltd. Faced with challenges like soft market conditions and impending regulation around the globe, many companies have turned to consolidation. Case in point: In 2014, acquirers spent $17 billion on property and casualty, multi-line insurance and reinsurance deals – the most since 2011, according to data compiled by Bloomberg.

Claude Lefebvre, chief underwriting officer at Hamilton Re, described M&A as part of a cycle that tends to take place during the soft market. Last year, about 390 insurance transactions were announced for a combined value of almost $50 billion, making it the busiest year for deals since 2008. This raises the question: Is bigger actually better?



(TNS) -- UNC officials are looking into what Chancellor Carol Folt termed a “completely unacceptable” failure of the system they use to warn students, faculty and staff of on-campus safety threats.

The review follows a pair of armed robberies that happened on campus at about 11 p.m. Wednesday, July 22.  Campus police are looking for two men who were in a white, four-door sedan and used handguns to threaten their victims.

Authorities implemented part of the alert system late that night, sounding sirens that by definition mean there’s an emergency somewhere on campus that requires people to go inside or take cover immediately.

The problem is that they’re also supposed to back up the sirens with Web bulletins, email, text messages or social media postings that explain what’s going on.

Those messages were 45 minutes late in trickling out.



Monday, 27 July 2015 00:00

What Is Community Resilience?

Fire, flood, famine, nuclear disaster — we’ve been through them all and more, and yet we so quickly forget. All but a few Americans, depending on which survey you read, remain stubbornly unprepared for the next disaster. Without preparedness, there can be no resiliency.

Insurer Allstate reports that 40 percent of Americans have thought about an evacuation plan, but just 8 percent have practiced an escape plan. Thirty percent say they’d take their chances and leave at the last minute in the face of a storm. More than half of parents say they’ve been directly impacted by disaster, according to Save the Children, yet 67 percent don’t know about the emergency plan at their kids’ schools, and 42 percent wouldn’t know where to find their kids after an evacuation.

Certainly things are better today than they used to be. “Fifty years ago there were no flood maps. Anyone could do whatever they wanted on a flood plain,” said Gene Whitney, a member of the Committee on Increasing National Resilience to Hazards and Disasters at the National Academy of Sciences/National Research Council. “Today communities are aware of the high-risk zones and they use those flood maps to guide their land-use decisions.”



Simplicity is the catchword behind the massive success of cloud-based file sharing services. The ability to access files and business applications without needing to invest in costly hardware installation and maintenance is a welcome relief for small, medium and large scale businesses alike. However, while cloud computing brings with it the joys of a simpler life, companies are likely to have some concerns before they put all their eggs in a provider’s cloud basket.

Here are some simple ideas on how to evaluate cloud providers as you "date" them before making a big commitment.



Monday, 27 July 2015 00:00

Big Changes in Store for Storage

New market data on the storage industry is out and the news does not look good for SAN, NAS and other forms of traditional data preservation.

According to a recent outlook from Wikibon, we are on the cusp of a digital extinction event as today’s complex network storage architectures give way to more nimble server-side solutions. The firm predicts that within 10 years, 90 percent of storage revenues will flow toward server SAN or hyperscale server SAN solutions, marking a 150 percent annual growth rate from today’s current market estimate of about $1 billion. At best, traditional SAN and NAS may eke out meager existences within long-term data retention infrastructure in which the frequency of data access is low but metadata retrieval is fairly steady.



Friday, 24 July 2015 00:00

Risk Must Be Personalized

Emergency preparedness isn’t about three days of water or extra batteries for your flashlight. If it were, we could stop investing in emergency preparedness campaigns and put the money toward buying 72-hour kits for every person in America. But we don’t, because that won’t make our communities more disaster resilient.

Preparing people for emergencies is about changing the way they think, not just before disasters, but also during them. What will make our communities more disaster resilient is to use emergency preparedness outreach as training for individuals to become effective disaster decision-makers: to teach them how to think in a crisis; to know what the disaster environment looks and feels like; to adapt; and to be empowered to take the necessary actions once decisions are made.

Effective disaster communication is not new territory. Researchers have been identifying ways to make risk and crisis communication more effective since the days of duck and cover. What’s missing is the practical application of those lessons in emergency management.



Gigantic baby steps--that’s one way to describe enterprise adoption of cloud applications.

On one hand, cloud adoption is happening faster than ever. On the other, companies are keeping themselves tightly tethered to their on-premise solutions.

While 93% of businesses were using at least one cloud application as of this year, the trend that has really taken off is the implementation of hybrid cloud environments (using both cloud and on-premise application deployments in tandem). As of this year, 82% of enterprise companies are using hybrid cloud strategies--up from 74% in 2014. This demonstrates that while organizations are fully prepared to take advantage of the benefits reaped when adopting cloud applications, the complexity of their on-premise deployments cannot be easily transferred to the cloud. The cloud isn’t always as “plug n’ play” as it is advertised to be.



Sometimes I think passwords are nothing but trouble for the security world. It seems like virtually every breach somehow involves passwords, whether it is because passwords are guessed to allow for the breach, or more often, the passwords are stolen and used in subsequent thefts. In April, while at the RSA Conference, I reported on a panel discussion that summed up the reason why passwords are still our primary form of authentication: We have a comfort level with them and no one is really that interested in change.

However, that attitude might actually be changing. New research from Accenture found that the majority of consumers are ready and willing to put aside passwords and try a different form of authentication. It’s a pretty large majority, too. According to the study, 60 percent of the 24,000 surveyed said they find the username/password combination to be cumbersome, while a whopping 77 percent said they’d be interested in using an alternative authentication method to protect their online information.



Friday, 24 July 2015 00:00

Lightning Fatalities Prompt Warning

The number of lightning deaths in the United States in 2015 continues to rise, the National Weather Service (NWS) has warned.

So far this year some 22 lightning fatalities have been recorded, just four shy of the 26 deaths recorded for the whole of 2014.

Alabama, Florida and Colorado top the states for lightning deaths in 2015 to-date with three lightning deaths each.

Lightning kills an average of 49 people in the U.S. each year, and hundreds more are severely injured, according to the NWS.



Friday, 24 July 2015 00:00

Social Media: The Next Level

Last November the emergency management team in Nashua, N.H., participated in a cross-border disaster preparedness exercise with Canadian agencies to evaluate how digital volunteers and social media can be incorporated in the official emergency response to address alerts, warnings and notifications as well as mutual aid.

A short time later, over Thanksgiving weekend, a powerful nor’easter hit New Hampshire, causing multiple accidents and power outages. “We ended up using skills learned during the exercise right away,” said Justin Kates, Nashua’s director of emergency management. “Through social media posts, our digital volunteers were tracking roads that were closed and compiling that info onto GIS maps to help first responders direct resources, clear trees from roads and restore power.”

Public information officers (PIOs) have used social media to share information with the public about disasters for years. But emergency management agencies are beginning to work on how to incorporate social media into operations to improve situational awareness for responders. And including social media in exercises is one way they’re building capacity and relationships, while also identifying best practices.



When an hour of downtime for a small to midsize business (SMB) can cost between $8,220 and $25,600, it’s obvious that your business needs a backup and recovery plan that gets data and systems back online quickly. After all, what company can lose that much money in such a short amount of time and stay in business?

For the third year, ChannelPro’s SMB readership has chosen backup and recovery company Carbonite as its Best Cloud Backup and Disaster Recovery Vendor. The company and its partners support over 1.5 million businesses and individuals by ensuring their data is safe and available even after a disaster. It provides cloud and hybrid cloud solutions for business continuity with an easy-to-use system for protecting and storing critical business data.

According to a testimonial on the Carbonite website, when one user’s hard drive died, all of their data was restored within just a few days. Having the reassurance that all business data can be restored allows many SMB owners to feel more at ease about trusting their business information to this technology.



Washington - Today, the Ad Council and the Department of Homeland Security’s Federal Emergency Management Agency (FEMA) announced the launch of a new public service advertisement (PSA) to raise awareness about the importance of being prepared for emergencies. While the PSA targets all communities, We Prepare Every Day is the first in a series of videos that aim to deliver a strong preparedness message by showing people with disabilities taking charge to prepare themselves and their families for emergencies.

The PSA provides equal access to all viewers and includes open captioning, a certified deaf interpreter, and audio description for viewers who are blind or have low vision.

“As we celebrate a quarter century of the ADA, we look to people with disabilities as leading the way,” said Craig Fugate, FEMA Administrator. “By taking their own preparedness actions every day, they set an example for all of us, including their families and their communities.”

The launch of the PSA coincides with the 25th anniversary of the Americans with Disabilities Act (ADA) on July 26, 2015. The ADA prohibits discrimination and ensures equal opportunity for people with disabilities in employment, state and local government services, public accommodations, commercial facilities, transportation and telecommunications. The ADA guarantees the civil rights of more than 56 million Americans.

“Everyone can and should think about their specific needs and prepare for the kinds of emergencies that can happen where they live, work or visit,” said Lisa Sherman, President and CEO of the Ad Council. “Our hope is that this campaign encourages everyone to think ahead and be prepared.”

The new PSA emphasizes the Ready Campaign’s four building blocks of preparedness - Build a Kit, Make a Plan, Be Informed and Get Involved. FEMA’s Ready campaign in partnership with the Ad Council has helped to generate more than 87 million unique visitors to the campaign’s website, Ready.gov, since its launch in 2003. Through the Ad Council, to date, the Ready campaign has received more than $1.1 billion in donated media.

To get more information on how to make a family emergency communication plan, build a disaster supply kit, or get involved in community preparedness, please visit ready.gov/myplan. The PSA was created pro bono by Free Range Studios and will be available for download from FEMA’s media library.


Federal Emergency Management Agency
FEMA’s mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards. Learn more at fema.gov.

The Ad Council
The Ad Council is a private, non-profit organization with a rich history of marshaling volunteer talent from the advertising and media industries to deliver critical messages to the American public. Having produced literally thousands of PSA campaigns addressing the most pressing social issues of the day, the Ad Council has affected, and continues to affect, tremendous positive change by raising awareness, inspiring action and saving lives. To learn more about the Ad Council and its campaigns, visit www.adcouncil.org, like us on Facebook, follow us on Twitter or view our PSAs on YouTube.

Free Range
Free Range is a world class brand and innovation studio with a commitment to driving positive social change through storytelling and design. Based in Oakland, CA and Washington D.C., Free Range has been named one of Fast Company’s Fast 50 most innovative companies and has won numerous Webbys, Addys and Sundance Interactive Awards. To learn more, visit FreeRange.com.

Andrew MacLeod, MBCI, investigates the origins of the term ‘resilience’ and demonstrates how its meaning, context and utility have evolved in the last 30 years.  This is the second paper in a series publishing the shortlisted entries in the Continuity Central Business Continuity Paper of the Year competition.

As Napolitano (2010), the US Secretary of Homeland Security, observed:

“… we are a resilient nation. But … we can’t guarantee there won’t be another successful terrorist attack … if that attack comes, our enemies will still not have succeeded, because our nation is too strong, and too resilient, to ever cower before a small group of violent extremists.”

The burgeoning use of ‘resilience’ has created a “concept used liberally and enthusiastically by policy makers, practitioners and academics” (McAslan, 2010). A Google search returns over five million references for ‘resilience’, and even the laconic Geoffrey Boycott now refers to England’s cricket team as lacking ‘resilience in the middle order’. There has been significant debate about the relationships between risk, business continuity, disaster recovery and crisis management. Resilience has the potential to be an umbrella term which encompasses these disciplines. Therefore, a precise understanding of the contemporary meaning of resilience is fundamental, lest it become an inappropriately applied term such as ‘strategic’. This paper will investigate the origins of the term resilience and demonstrate how its meaning, context and utility have evolved in the last 30 years. Resilience and its utility will be examined in relation to a number of sectors: environmental, individual, community, organizational, and national security. It will be demonstrated that there are numerous definitions of resilience, each contextually sensitive. Nevertheless, the term resilience underpins a mind-set, a common set of characteristics and an ability to recover no matter the context.



One of the bugbears of IT network security is the denial of service (DoS) attack. Instead of (or as well as) trying to sneak past a firewall with a few innocent-looking data packets, a DoS attack tries to cripple a network or a system by swamping it. In the case of network firewalls, the attacker will try to generate as much network traffic as possible to overload the firewall’s processing power. Attackers often multiply the sources of that traffic for this reason, leading to distributed denial of service (DDoS) attacks. Firewalls that are overwhelmed by traffic may become unmanageable, unless the vendor has taken suitable design precautions, which tend to reflect good business continuity practice in general.
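One such design precaution is rate limiting, so that excess traffic is shed cheaply before it exhausts the device. The sketch below is a generic token-bucket limiter in Python, for illustration only; it is not any particular firewall vendor’s mechanism:

```python
import time

class TokenBucket:
    """Generic token-bucket rate limiter (illustrative sketch only).

    Each packet consumes one token; tokens refill at a fixed rate up to
    a burst capacity. When the bucket is empty, packets are dropped
    rather than queued, so a traffic flood cannot exhaust the device.
    """

    def __init__(self, rate, capacity):
        self.rate = rate               # tokens replenished per second
        self.capacity = capacity       # maximum burst size
        self.tokens = float(capacity)  # start with a full bucket
        self.last = time.monotonic()

    def allow(self):
        """Return True if a packet may pass, False if it should be dropped."""
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```

Under normal load the bucket stays full and every packet passes; under a flood, everything beyond the configured rate is discarded with minimal work, keeping the firewall itself manageable.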



There are 50 stars on our U.S. flag representing the 50 states that make up the Union. But when it comes to emergency management there are 100 states, not 50.

No, I’m not using some form of new math. What I’m referring to is the juxtaposition of rural and urban areas that exists in each state. Every state has at least one urban area. Some, like Florida and California, have more than one. Other states have one large urban area that dominates the politics, infrastructure, resources and attention of business, industry and state-level politicians. New York has New York City and upstate. Illinois has Chicago and then the rest of the state. Even a state like Nebraska has Omaha versus the more rural areas.

Emergency management is not immune from these urban versus rural differences. Perhaps the biggest disparity is the number of resources, generally meaning money, but that translates quickly into funding for staffing and the number of program areas that can be supported. In many ways these 100 state emergency management “districts,” which I’ll call urban and rural, use different methods to achieve success.



A cyberattack targeting the U.S. power grid would have widespread economic implications, resulting in insurance claims of between $21.4 billion and $71.1 billion in a worst case scenario, according to a report by Lloyd’s.

Lloyd’s and the University of Cambridge’s Centre for Risk Studies recently released “Business Blackout,” which examines the insurance implications of a major cyberattack using the U.S. power grid as an example. In the scenario outlined, malware is used to infect control rooms for generating electricity in areas of the Northeastern U.S. The malware goes undetected and locates 50 generators that it can control, forcing them to overload and burn out. The scenario, described as “improbable but technologically possible,” leaves 15 states in darkness, meaning that 93 million people are without power.

Economic impacts include direct damage to assets and infrastructure, decline in sales revenue to electricity supply companies, loss of sales revenue for businesses and disruption to the supply chain. The total impact to the U.S. economy is estimated at $243 billion, rising to more than $1 trillion in the most extreme version of the scenario.



(TNS) -- University of Texas researchers have been awarded a $13.7 million federal grant to develop a software platform and other cybertools to help engineers construct buildings, levees, bridges, highways and other structures that are better able to withstand earthquakes and other natural hazards.

“There is tremendous potential to save lives and property through better engineering, design and planning,” said Ellen Rathje, a civil engineering professor and the project’s principal investigator.

The grant from the National Science Foundation, to be paid out over five years, will fund development of a Web platform, data repository and other tools that will allow engineers to simulate how various designs of structures, including residential housing, would hold up in an earthquake, hurricane, tornado or coastal storm surge, Rathje said.



No one thought about data standards when the Jack in the Box E. coli epidemic erupted in 1993. Instead, there was panic as the stomach-clenching illness engulfed more than 700 victims across California, Washington, Idaho and Nevada. The strain of bacteria, transmitted through undercooked beef patties, left more than 170 with permanent kidney and brain damage. Most of these were children, and tragically, four died as a result.

For Sarah Schacht, Socrata’s Public Health Data Advisor, the national epidemic resonates in a personal way.

“I’m a two-time E. coli survivor,” Schacht recalled. During the Jack in the Box outbreak, she contracted the disease at the age of 13 along with her 5-year-old brother. And in 2013, she was diagnosed yet again after dining at a Seattle restaurant.



(TNS) - Turns out that it’s not as easy as you might think to transform what had been the yards of hundreds of flood-ruined homes into ball fields in the city’s emerging riverside greenway.

Proof of that is the heavy equipment working along the river in Time Check and at Czech Village, scraping up topsoil so it can be sifted, screened, and cleaned before being put in place to make practice ball fields for football, soccer, and other activities.

Pieces of demolition debris, glass, steel, sewer tile, roots, rocks and much else have been screened from the soil so sharp edges don’t tear up youngsters when the place where blocks of homes once stood becomes practice fields, said Steve Krug, landscape architect for the Department of Parks and Recreation.



Data breaches and cyberattacks happen daily, across industries and to businesses of all sizes. However, as these attacks become more sophisticated, companies admit that they are at a loss as to how best to protect their data. According to eWeek, a study from RSA shows that those responsible for protecting the network don’t necessarily trust their information security capabilities.

The Cybersecurity Poverty Index survey revealed that four in 10 companies admitted that their security capabilities were “functional,” or, in terms of the survey, average. In all, approximately 75 percent of the 400 companies interviewed confessed that their security abilities were either average or below average when compared to the standards suggested by the Cybersecurity Framework, which was developed by the U.S. National Institute of Standards and Technology.

The RSA study used five areas to measure information security capabilities, as eWeek reported:

The five components of an information-security program include identifying threats, protecting information assets, detecting attacks, responding to incidents and recovering from compromises.



(TNS) - In the heat of a major catastrophe, getting critical information to the public is crucial to saving lives and establishing trust.

At FEMA’s Emergency Management Institute, 60 Macon-Bibb County leaders are learning the challenges communities face when not only local folks but also the eyes of the world are looking to them for news.

“If people don’t know what’s happening and what to do, then they are not going to respond accordingly,” said Pam Collins, a public information specialist and FEMA adjunct instructor.

During Tuesday morning’s briefing, Collins urged the representatives of Macon-Bibb government and private-sector agencies and organizations to have plans in place to ensure the public has the information it needs; otherwise, people will turn to unofficial and sometimes inaccurate sources, such as social media.



If you have a chief information security officer working for your company, chances are that the rest of the executive leadership team wholly undervalues their contribution to the organization.

Unsurprisingly, the job of safeguarding data is massively under-appreciated by other C-level executives, according to a new study from ThreatTrack Security. The company recently released its second annual Role of The CISO study, which surveyed 200 C-level executives at U.S. enterprises with a chief information security officer about the importance of having such an individual managing the company’s sensitive information.



Once a month I use my blog to highlight some of S&R’s most recent and trending research. This month I’m focusing on application security and asking for your help with some of our upcoming research into the security and privacy risks associated with the Internet of Things (IoT). IoT is any technology that enables devices, objects, and infrastructure to interact with monitoring, analytics, and control systems over the Internet. The illustrious and debonair Tyler Shields (@txs) will lead our research into IoT security, but as the risks become more and more concrete for various verticals, you can expect the entire team to engage in this research.

Take our IoT security survey and talk with our analysts! If you contribute to the emerging IoT market, please fill out this brief survey (http://forr.com/2015-IoT-Security-Survey). Participants will receive a complimentary copy of the completed research report and we’d be happy to interview anyone who would like to discuss IoT and security in detail. Be sure to reach out to Tyler or Jennie Duong if you’re interested.



Last week, a New Yorker article about a catastrophic earthquake predicted for the Northwest — that will unleash its fury and “spell the worst natural disaster in the history of the continent” — stoked our nation’s collective fears about a disaster not unlike those seen in Hollywood blockbusters. And while stories such as this incite a high level of anxiety in the public, they also motivate people to start the huge undertaking of creating a resilient community that could respond and recover from a disaster of this magnitude. 

What is a resilient community? The $100 million Rockefeller Foundation project called 100 Resilient Cities defines it as “the capacity of individuals, communities, institutions, businesses and systems within a city [or community] to survive, adapt and grow no matter what kinds of chronic stresses and acute shocks they experience.” 

As a civic tech entrepreneur and founder of Appallicious, I have worked with the White House, the Federal Emergency Management Agency (FEMA), nongovernmental organizations, universities, foundations, responders and local governments for a year and a half on a project to leverage technology and data to help communities respond to and recover from a disaster. I listened intently and took the best recommendations, ideas, theories and practices from all of these thought leaders and worked to integrate their ideas into a customizable platform for daily and catastrophic events. What started as the first full-life-cycle Disaster Assistance and Assessment Dashboard (DAAD) has been transformed, through extensive iterative stakeholder development and design sessions, into the Community Resilience Platform (CRP). The CRP is the first daily-use, customizable, white-label preparedness, planning, response and recovery platform developed for communities to build their own regional and local resilience platforms.



Teaching prospects about the Health Insurance Portability and Accountability Act (HIPAA) could help managed service providers (MSPs) boost their revenues, according to RapidFire Tools.

The company behind the Network Detective application and reporting tool this week released a survey that revealed many MSPs are using HIPAA compliance assessments to increase business and better engage prospects.



If there’s one conversation that invariably creates a lot of hand-wringing among the IT professionals I’ve spoken with in recent years, it’s the one that centers on “shadow IT.” Buying and implementing technology independent of the IT organization—a practice that is probably most widely associated with marketing organizations—raises all sorts of hackles among these IT pros, and they’re not afraid to share their thoughts on why it’s a bad idea.

But there’s a fascinating dimension to all of this. Gartner has famously predicted that by 2017, marketing organizations will spend more on technology than IT organizations themselves spend. If that’s the case, it seems to me there’s a question that’s begging to be asked: If it’s marketing that’s driving the tech spending, then who’s the substance, and who’s the shadow?

I recently had the opportunity to discuss the marketing vs. IT topic with Chris Vennitti, vice president, contract staffing services at the HireStrategy subsidiary of Addison Group, a Chicago-based staffing and recruitment firm that specializes in IT. Vennitti lives and breathes this stuff, so I opened the conversation with the notion that the way things are going these days, you really do have to wonder which is the shadow—the marketing organization or the IT organization. I asked Vennitti for his thoughts on that, and he clearly accepted the legitimacy of the question:



British businesses are beginning to take a more sophisticated approach to disaster recovery (DR) planning, but most still fail to test their provisions frequently enough.

This is according to a new report from Plan B, in which 200 IT professionals and decision-makers at UK firms were polled on their DR practices.

It found that many are adopting hybrid DR plans, using a wide range of in-house and outsourced solutions to suit the “budget and criticality” of different IT systems.

However, fewer than a third (31 per cent) of respondents test their plans more than once a year, the researchers found, and just one in five (21 per cent) do so “properly” – trialling every component of their DR strategy in a single dry run.

This could put them at risk of missed recovery time objectives due to unforeseen bottlenecks, as well as data loss due to incomplete backups.

“Buyers are getting smarter, which is really good news for the business continuity world, but we still need to promote testing as an area to take more seriously to reduce IT downtime,” Plan B managing director Tim Dunger told Computer Weekly.

It is wise to choose a data recovery company that has a track record in recovering from the type of data loss you have experienced.

From: http://www.krollontrack.co.uk/company/press-room/data-recovery-news/report-uk-firms-not-testing-dr-plans-frequently-enough795.aspx

The updating of NFPA 1600, the Standard on Disaster/Emergency Management and Business Continuity Programs, has reached a new stage. The Second Draft Report has now been posted and NITMAMs are now being accepted.

Under NFPA rules, anyone wishing to make an allowable amending motion at an NFPA Technical Meeting must declare their intentions by filing, within the published deadline, a NITMAM (Notice of Intent to Make a Motion).

The Motions Committee of the NFPA Standards Council, in accordance with NFPA rules, then reviews each NITMAM to determine whether the intended motion is a proper motion.

The deadline for NFPA 1600 new edition NITMAMs is August 21, 2015.


The Federal Emergency Management Agency provides two main types of assistance following natural disasters, such as the Texas storms, tornadoes, straight-line winds and flooding that occurred May 4 through June 19.

Individual Assistance is provided by the Federal Emergency Management Agency (FEMA) to individuals and families who have sustained losses due to disasters.

  • Texas homeowners, renters and business owners in designated counties who sustained damage to their homes, vehicles, personal property, businesses or inventory as a result of the May 4 through June 19 severe storms and floods may apply for disaster assistance.
  • Disaster assistance may include grants to help pay for temporary housing, emergency home repairs, uninsured and underinsured personal property losses, and medical, dental and funeral expenses caused by the disaster, along with other serious disaster-related expenses.
  • Disaster assistance grants are not taxable income and will not affect eligibility for Social Security, Medicaid, medical waiver programs, welfare assistance, Temporary Assistance for Needy Families, food stamps, Supplemental Security Income or Social Security Disability Insurance.
  • As a FEMA partner, the U.S. Small Business Administration (SBA) offers low-interest disaster loans to businesses of all sizes, private non-profit organizations, homeowners and renters. SBA disaster loans are the primary source of federal long-term disaster recovery funds for disaster damages not fully covered by insurance or other compensation. They do not duplicate benefits of other agencies or organizations.

Public Assistance can fund the repair, restoration, reconstruction or replacement of a public facility or infrastructure damaged or destroyed by a disaster.

  • FEMA will provide a reimbursement grant of 75 percent of eligible costs, with the state and local governments sharing the remaining 25 percent of costs. Eligible entities include state governments, local governments and any other political subdivision of the state, Native American tribes and Alaskan Native Villages. Certain private nonprofit organizations, such as educational, utility, irrigation, emergency, medical, rehabilitation, and temporary or permanent custodial care facilities also may receive assistance.
  • Although funds are awarded to government entities and nonprofits, the Public Assistance program is intended to benefit everyone — neighborhoods, cities, counties and states. Public Assistance dollars help clean up communities affected by disaster-related debris, repair the roads and bridges people use every day getting to work and school, put utilities and water systems back in order, repair hospitals and emergency services, rebuild schools and universities, and restore playground equipment in public parks.


All FEMA disaster assistance will be provided without discrimination on the grounds of race, color, sex (including sexual harassment), religion, national origin, age, disability, limited English proficiency, economic status, or retaliation. If you believe your civil rights are being violated, call 800-621-3362 or 800-462-7585 (TTY/TDD).

FEMA’s mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards. 

The SBA is the federal government’s primary source of money for the long-term rebuilding of disaster-damaged private property. SBA helps businesses of all sizes, private non-profit organizations, homeowners and renters fund repairs or rebuilding efforts and cover the cost of replacing lost or disaster-damaged personal property. These disaster loans cover losses not fully compensated by insurance or other recoveries and do not duplicate benefits of other agencies or organizations. For more information, applicants may contact SBA’s Disaster Assistance Customer Service Center by calling 800-659-2955 or visiting SBA’s website at www.sba.gov/disaster. Deaf and hard-of-hearing individuals may call 800-877-8339.

FEMA’s temporary housing assistance and grants for childcare, medical, dental expenses and/or funeral expenses do not require individuals to apply for an SBA loan. However, those who receive SBA loan applications must submit them to SBA to be eligible for assistance that covers personal property, transportation, vehicle repair or replacement, and moving and storage expenses.

For more information on Texas recovery, visit the disaster web page at www.fema.gov/disaster/4223, Twitter at https://www.twitter.com/femaregion6 and the Texas Division of Emergency Management website, https://www.txdps.state.tx.us/dem.

Visit www.fema.gov/texas-disaster-mitigation for publications and reference material on rebuilding and repairing safer and stronger.

Can you share too much information (TMI) online? As organizations use more public cloud services, IT service providers should be careful about what cloud-based file sharing services they recommend. And, while the rise in public and hybrid clouds isn’t a bad thing, as an MSP you need to explain to your clients the risks associated with the public cloud so they can make an educated decision about what mix of public and private cloud services is right for them. Here are a few of the risks you should highlight:



CVS (CVS) last week notified CVSphoto.com customers that the independent vendor managing online payments for the website may have suffered a credit card breach.

And as a result, CVS tops this week's list of IT security news makers, followed by University of California, Los Angeles (UCLA) Health System, University of Pittsburgh Medical Center (UPMC) Health Plan and Symantec (SYMC).

What can managed service providers (MSPs) and their customers learn from these IT security news makers? Check out this week's list of IT security stories to watch to find out:



WASHINGTON — In the month since a devastating computer systems breach at the Office of Personnel Management, digital SWAT teams have been racing to plug the most glaring security holes in government computer networks and prevent another embarrassing theft of personal information, financial data and national security secrets.

But senior cybersecurity officials, lawmakers and technology experts said in interviews that the 30-day “cybersprint” ordered by President Obama after the attacks is little more than digital triage on federal computer networks that are cobbled together with out-of-date equipment and defended with the software equivalent of Bubble Wrap.

In an effort to highlight its corrective actions, the White House will announce shortly that teams of federal employees and volunteer hackers have made progress over the last month. At some agencies, 100 percent of users are, for the first time, logging in with two-factor authentication, a basic security feature, officials said. Security holes that have lingered for years despite obvious fixes are being patched. And thousands of low-level employees and contractors with access to the nation’s most sensitive secrets have been cut off.
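Two-factor authentication pairs a password with a one-time code from a second device. As an illustrative sketch only (not the federal rollout described above), here is how a time-based one-time password (TOTP, RFC 6238) can be generated and checked using nothing but the Python standard library; the function names are my own:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """Derive a time-based one-time password (RFC 6238) from a base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if t is None else t) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation, per RFC 4226
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def verify(secret_b32, submitted, now=None):
    """Server-side check that tolerates one step of clock drift either way."""
    now = time.time() if now is None else now
    return any(hmac.compare_digest(totp(secret_b32, now + drift), submitted)
               for drift in (-30, 0, 30))
```

The RFC 6238 test vectors (ASCII secret `12345678901234567890`, t = 59 seconds) yield 94287082 for an eight-digit code, which makes a handy sanity check for any implementation.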



Monday, 20 July 2015 00:00

Top Ten Tips for DR as a Service

In this article, we provide tips for what can be a particularly challenging task: deciding when and how to implement DRaaS in the enterprise.

Buy It, Don’t Hire It 

Some organizations already have an in-house team with the necessary expertise to establish and maintain a sophisticated DR plan. But plenty of others don’t even come close. In those cases, it is probably easier to buy the necessary DR technology and resources from the cloud than to try to hire it and build it in house.

“DRaaS is often a good fit for small to midsize businesses that lack the necessary expertise to develop, configure, test, and maintain an effective disaster recovery plan,” said Wayne Meriwether, an analyst for IT research firm Computer Economics.



“The Internet of Things is the biggest game changer for the future of security,” emphasizes David Bennett, vice president of Worldwide Consumer and SMB Sales at Webroot. “We have to figure out how to deal with smart TVs, printers, thermostats and household appliances, all with Internet connectivity, which all represent potential security exposures.”

Simply put, the days of waiting for an attack to happen, mitigating its impact and then cleaning up the mess afterward are gone. Nor is it practical to just lock the virtual door with a firewall and hope nothing gets in; the stakes are too high. The goal instead must be to predict potential exposure, and that requires comprehensive efforts to gather threat intelligence. According to Bennett, such efforts should be:



(TNS) - Few countries know how to deal with widespread disaster better than Japan, and on Thursday, Japanese firefighter Junichi Matsuo told his Yakima Valley counterparts what it was like to respond to the devastating 2011 earthquake and tsunami that killed more than 13,000 people.

“That was the first time I’d ever seen such a terrible situation,” said Matsuo, a veteran firefighter with decades of emergency response experience.

But the disaster also held lessons on the importance of community planning and community involvement in responding to a crisis, he said.

The magnitude 9.0 earthquake that struck March 11, 2011, was the most powerful recorded earthquake ever to hit Japan and the fourth-strongest worldwide since modern record-keeping began in 1900.



The cost and consequence of a product recall

The number of food recalls and their costs to business are rising according to a new publication by Swiss Re which highlighted that since 2002, the annual number of recalls in the US has almost doubled. Food contamination costs US health authorities US$ 15.6 billion per year (nearly nine million Americans became sick from contaminated food in 2013 alone) and half of all food recalls cost the affected companies more than US$ 10 million.

Food manufacturers operate in a vast, globalised supply chain, making risk management for food recalls more difficult, yet one mislabelled product or contaminated ingredient can cause sickness, death, multi-million-dollar losses and massive reputational damage for the affected companies. Swiss Re's Food Safety in a Globalised World examines how the increasing number of food recalls is impacting consumers, public health services, governments and companies globally.

Product quality incidents or product safety incidents may not have been identified as a major threat to organizations according to the Business Continuity Institute’s latest Horizon Scan Report, but they do still raise some concerns among business continuity professionals. 26% of respondents to a survey expressed either concern or extreme concern about the prospect of a product quality incident that would disrupt the organization and 19% expressed the same level of concern over a product safety incident.

The latest Supply Chain Resilience Report produced by the BCI revealed that 40% of respondents to a survey claimed their supply chain had been impacted by a product quality incident during the previous twelve months. Many did suggest that the impact was low, but even a low impact can be disruptive.

"In a more globalised economy, ensuring the highest level of food safety is becoming an ever greater challenge for firms," says Jayne Plunkett, Head of Casualty Reinsurance at Swiss Re. "Today ingredients and technologies are sourced worldwide. This leads to greater complexity for food manufacturers and consumer and regulatory demands on companies are continually increasing."

As cyber threats emerge and evolve each day, they pose challenges for organizations of all sizes, in all industries. Even though most industries are investing heavily in cybersecurity, many companies are still playing catch up, discovering breaches days, months, and even years after they occur. The 2015 Verizon DBIR shows that this “detection deficit” is still increasing: The time taken for attackers to compromise networks is significantly less than the time it takes for organizations to discover breaches.

The risk posed by third parties complicates the issue further. How can an organization allocate time and resources to vetting its partners’ security when it is struggling to keep up with its own? Over the years, audits, questionnaires, and penetration tests have helped to assess third-party risk. However, in today’s ever-changing cyber landscape, these tools alone do not offer an up-to-date, objective view. While continuous monitoring solutions can improve detection and remediation times for all organizations, the retail, healthcare, and utilities industries can especially benefit from greater adoption.



Over the next few weeks the shortlisted articles and papers in Continuity Central’s Business Continuity Paper of the Year competition will be published, with the winner announced after that. This is the first shortlisted paper, written by Ken Simpson, FBCI.

Are you looking to build a high-performing team? Where each member understands their role, and how they fit with other team members’ roles? A team that can execute on the prepared game plan - while at the same time has the capability to improvise as the situation warrants?

That description might be something your business continuity, incident response and/or crisis management teams aspire to - or it may be just as appropriate a goal for your ‘business as usual (BAU)’ functional teams. In any case it applies to teams that seek to compete in elite level sports and perhaps we can learn something about how to prepare teams from the methods used in the sporting domain.

The nature of training and preparation changes as players and teams move from the participation and social levels of sport into elite competitions. Basic drills, sloppy execution and general fitness regimes are replaced with targeted training programs - building high-level skills, disciplined execution and embedding team concepts.



You may have read that the Justice Department is warning food manufacturers that they could face criminal and civil penalties if they poison their customers with contaminated food.

Recent high profile food recalls, such as the one at Texas-based Blue Bell Creameries and another at Ohio-based Jeni’s Splendid Ice Creams, have drawn attention to this issue once again.

Now a new report by Swiss Re finds that the number of food recalls per year in the United States has almost doubled since 2002, while the costs are also rising.

Half of all food recalls cost the affected companies more than $10 million each and losses of up to $100 million are possible, Swiss Re says. These figures exclude the reputational damage that may take years for a company to recover from.



(TNS) - The mayors of four communities in south Mississippi weren't so eager at first to recall the events of 10 years ago, when they were new to the job and Hurricane Katrina had devastated their cities. But during a program Wednesday, they remembered the destruction, the people who came to help, and the chickens.

Those in the audience of the Katrina +10 presentation at the Ohr-O'Keefe Museum of Art nodded as they remembered with the mayors how it was in the days after the storm and laughed at some of their stories.

Moderator Joe Spraggins had just retired as a brigadier general in the Air Force and said he asked the Lord, "I want to have a challenge in my next career."

His first day on the job as the head of emergency operations for Harrison County was Aug. 29, 2005, the day Katrina hit.



Climate markers continue to show global warming trend

State of the Climate in 2014 report available online. (Credit: NOAA)

In 2014, the most essential indicators of Earth’s changing climate continued to reflect trends of a warming planet, with several markers, such as rising land and ocean temperature, sea levels and greenhouse gases, setting new records. These key findings and others can be found in the State of the Climate in 2014 report released online today by the American Meteorological Society (AMS).

The report, compiled by NOAA’s Center for Weather and Climate at the National Centers for Environmental Information, is based on contributions from 413 scientists from 58 countries around the world. It provides a detailed update on global climate indicators, notable weather events, and other data collected by environmental monitoring stations and instruments located on land, water, ice, and in space.

“This report represents data from around the globe, from hundreds of scientists and gives us a picture of what happened in 2014. The variety of indicators shows us how our climate is changing, not just in temperature but from the depths of the oceans to the outer atmosphere,” said Thomas R. Karl, L.H.D, Director, NOAA National Centers for Environmental Information.

For State of the Climate in 2014 maps, images and highlights, visit Climate.gov. (Credit: NOAA)

The report’s climate indicators show patterns, changes and trends of the global climate system. Examples of the indicators include various types of greenhouse gases; temperatures throughout the atmosphere, ocean, and land; cloud cover; sea level; ocean salinity; sea ice extent; and snow cover. The indicators often reflect many thousands of measurements from multiple independent datasets.

“This is the 25th report in this important annual series, as well as the 20th report that has been produced for publication in BAMS,” said Keith Seitter, AMS Executive Director. “Over the years we have seen clearly the value of careful and consistent monitoring of our climate which allows us to document real changes occurring in the Earth’s climate system.”

Key highlights from the report include:

  • Greenhouse gases continued to climb: Major greenhouse gas concentrations, including carbon dioxide, methane and nitrous oxide, continued to rise during 2014, once again reaching historic high values. Atmospheric CO2 concentrations increased by 1.9 ppm in 2014, reaching a global average of 397.2 ppm for the year. This compares with a global average of 354.0 ppm in 1990, when this report was first published 25 years ago.
  • Record temperatures observed near the Earth’s surface: Four independent global datasets showed that 2014 was the warmest year on record. The warmth was widespread across land areas. Europe experienced its warmest year on record, with more than 20 countries exceeding their previous records. Africa had above-average temperatures across most of the continent throughout 2014, Australia saw its third warmest year on record, Mexico had its warmest year on record, and Argentina and Uruguay each had their second warmest year on record. Eastern North America was the only major region to experience below-average annual temperatures.
  • Tropical Pacific Ocean moves towards El Niño–Southern Oscillation conditions: The El Niño–Southern Oscillation was in a neutral state during 2014, although it was on the cool side of neutral at the beginning of the year and approached warm El Niño conditions by the end of the year. This pattern played a major role in several regional climate outcomes.  
  • Sea surface temperatures were record high: The globally averaged sea surface temperature was the highest on record. The warmth was particularly notable in the North Pacific Ocean, where temperatures are in part likely driven by a transition of the Pacific decadal oscillation – a recurring pattern of ocean-atmosphere climate variability centered in the region.
  • Global upper ocean heat content was record high: Globally, upper ocean heat content reached a record high for the year, reflecting the continuing accumulation of thermal energy in the upper layer of the oceans. Oceans absorb over 90 percent of Earth’s excess heat from greenhouse gas forcing.
  • Global sea level was record high: Global average sea level rose to a record high in 2014. This keeps pace with the 3.2 ± 0.4 mm per year trend in sea level growth observed over the past two decades.
  • The Arctic continued to warm; sea ice extent remained low: The Arctic experienced its fourth warmest year since records began in the early 20th century. Arctic snow melt occurred 20–30 days earlier than the 1998–2010 average. On the North Slope of Alaska, record high temperatures at 20-meter depth were measured at four of five permafrost observatories. The Arctic minimum sea ice extent reached 1.94 million square miles on September 17, the sixth lowest since satellite observations began in 1979. The eight lowest minimum sea ice extents during this period have occurred in the last eight years.
  • The Antarctic showed highly variable temperature patterns; sea ice extent reached record high: Temperature patterns across the Antarctic showed strong seasonal and regional patterns of warmer-than-normal and cooler-than-normal conditions, resulting in near-average conditions for the year for the continent as a whole. The Antarctic maximum sea ice extent reached a record high of 7.78 million square miles on September 20. This is 220,000 square miles more than the previous record of 7.56 million square miles that occurred in 2013. This was the third consecutive year of record maximum sea ice extent. 
  • Tropical cyclones above average overall: There were 91 tropical cyclones in 2014, well above the 1981–2010 average of 82 storms. The 22 named storms in the Eastern/Central Pacific were the most to occur in the basin since 1992. Similar to 2013, the North Atlantic season was quieter than most years of the last two decades with respect to the number of storms.
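As a quick back-of-the-envelope check on the greenhouse-gas highlight above, the 1990 and 2014 CO2 averages quoted in the report imply a long-run mean rise of about 1.8 ppm per year, consistent with the 1.9 ppm increase observed in 2014 itself:

```python
# CO2 averages quoted in the report's greenhouse-gas highlight.
co2_1990_ppm = 354.0
co2_2014_ppm = 397.2
years = 2014 - 1990

mean_annual_rise_ppm = (co2_2014_ppm - co2_1990_ppm) / years
print(round(mean_annual_rise_ppm, 2))  # ppm per year, roughly 1.8
```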

The State of the Climate in 2014 is the 25th edition in a peer-reviewed series published annually as a special supplement to the Bulletin of the American Meteorological Society. The journal makes the full report openly available online.

NOAA’s mission is to understand and predict changes in the Earth's environment, from the depths of the ocean to the surface of the sun, and to conserve and manage our coastal and marine resources. Join us on Facebook, Twitter, Instagram and our other social media channels.

The Office of Personnel Management (OPM) breach is in the news again. As you may have heard, it is much worse than originally thought, with nearly 22 million records compromised. With this news, this breach is the second one in less than three months that has hit a little too close to home for me personally.

It’s also not surprising. Our government is ridiculously lax in its cybersecurity efforts, especially when you consider the amount of personally identifiable information held in government databases. Remember, the OPM breach didn’t just have Social Security numbers and birthdates. PII revealed also included things like fingerprints and findings from security clearance investigations. The stealing of this data has created a new level of identity theft problems for the individuals affected, according to the security experts at NuData, who provided the following commentary to me in an email:



Thursday, 16 July 2015 00:00

BCI: Facing a skills shortage

A new study by the Confederation of British Industry and Pearson has shown that demand for higher-level skills in British industry is set to grow in the years ahead, with sectors central to future growth – manufacturing and construction – particularly hard-pressed.

The Education and Skills Survey highlighted that over two-thirds of businesses (68%) expect their need for staff with higher level skills to grow in the years ahead, but more than half of those surveyed (55%) fear that they will not be able to access enough workers with the required skills.

Availability of talents/key skills may not have been the greatest threat to organizations according to the Business Continuity Institute’s latest Horizon Scan Report, but it is still a threat: 43% of business continuity professionals surveyed expressed either concern or extreme concern about the prospect of their organization suffering from a lack of available talent and key skills.

Katja Hall, Deputy Director-General at the CBI, said: “The Government has set out its stall to create a high-skilled economy, but firms are facing a skills emergency now, threatening to starve economic growth. Worryingly, it’s those high-growth, high-value sectors with the most potential which are the ones under most pressure."

Rod Bristow, President of Pearson’s UK business, said: “Better skills are not only the lifeblood of the UK economy – as fundamental to British business as improving our infrastructure, technology and transport links – they are also critical to improving young people's life chances, of enabling them to be a success in life and work."

The virtual data center is one of those things that sounded like a great idea at first, only to lose much of its appeal upon reflection. But while few organizations are pursuing a fully abstracted, end-to-end data environment, it appears that many data processes will benefit tremendously by not having to rely on integrated hardware/software infrastructure.

The virtual data center has gotten a boost from a number of key software developments lately that remove much of the complexity in creating functional data stacks in either on-premises or third-party clouds. One is the Mesosphere Datacenter Operating System (DCOS), which recently saw the release of a software development kit that allows cluster-wide installation and operation of Java, Go and Python services using a simple web or command-line interface. The system features a range of schedulers for various application types, such as long-term micro services, batch processing and storage, allowing enterprises to custom-build data frameworks to support highly specialized functions.



‘Banana Skins’ poll reflects industry risk perception

A new survey charting the top risks in the global insurance sector shows that cyber risk and interest rates are now among the top risks for insurers. Both are new entries in the rankings of this fifth successive survey, indicating how high a concern they have become for the industry when viewed in conjunction with regulatory developments and the broader macro-economy.

The CSFI’s latest ‘Insurance Banana Skins 2015’ survey, conducted in association with PwC, polled over 800 insurance practitioners and industry observers in 54 countries, to find out where they saw the greatest risks over the next 2-3 years.

Regulatory risk emerged as the overall top risk for participants in the survey for the third successive time, underlining the deep impact regulatory change is having.



(TNS) - Twenty years ago this week, Chicago was gripped by one of the city's worst natural disasters: a scorching heat wave that claimed more than 700 victims, mostly the poor, elderly and others on society's margins.

The temperature hit 106 degrees on July 13, 1995, and would hover between the high 90s and low triple digits for the next five days. Dozens of bodies filled the Cook County medical examiner's office. On a single day — July 15 — the number of heat-related deaths reached its highest daily tally of 215; refrigerated trucks were summoned to handle the overflow of corpses.

Two decades later, the collective failings that contributed to the death toll are now well-documented: a city caught off guard, social isolation, a power grid that couldn't meet demand and a lack of awareness on the perils of brutal heat.



Thursday, 16 July 2015 00:00

MSPs, Don't Ignore Cloud Opportunities

Just as the IT channel was getting comfortable a half-dozen years ago with managed services, another new service model was vying for recognition – the cloud. Many MSPs have since added cloud-based services, but some still struggle with how to go about it.

If you ask Michael Corey why, the founder and president of Dedham, Massachusetts-based MSP Ntirety will tell you one of the main obstacles is self-imposed: IT service providers fear cloud-based services will cannibalize parts of their businesses. They’ve made money delivering services in a certain way for so long that the idea of replacing it with a cloud model scares them.



Don’t Fall into the Vendor Lock-In Trap of Hyper-convergence

About two years ago, I wrote a blog post (Storage Vendor Lock-in – Is the End Near?) discussing how two emerging technologies, convergence and VM-aware storage, and more importantly the synergy between them, might provide relief from vendor lock-in. Two years later, these two technologies have matured quite a bit, and their combination, widely referred to as hyper-convergence, is a pretty hot trend in IT.

For many customers, flexibility and avoiding vendor lock-in are primary concerns and a key reason for considering hyper-convergence. While all of us at Maxta have been busy improving our hyper-converged solutions and keeping them flexible and free of vendor lock-in, the same cannot be said of some of our competitors. Unfortunately, some vendors are not leveraging the inherent potential of hyper-convergence to reduce vendor lock-in, and others are making moves to increase lock-in to their own offerings.



It’s no surprise that in today’s world, data grows by leaps and bounds daily. In fact, IDC and EMC report that global data will increase “by 50 times by 2020.” With the use of mobile devices, social networks and cloud applications, all businesses, large and small, can benefit from capturing and analyzing consumer and business data. Several companies have come forward with BI solutions for such businesses in recent months.

Most recently, Quatrro Business Support Services has created a new business intelligence (BI) and financial analytics tool to help small to midsize businesses (SMBs) gather unstructured data and use it to make informed business decisions.

The BI Tool features financial dashboards, reporting templates and alerts to assist SMBs in making sense of the mounds of unstructured data they collect. According to PCWorld, SMBs will benefit from the BI Tool’s analysis and planning features to set up benchmarking and unit comparisons when attempting to identify trends in a market. It can also help with budgeting, forecasting and predictive analysis, which can give SMBs the ability to grow and expand.
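The trend-spotting and forecasting capabilities described above generally rest on simple statistical extrapolation. The following is an illustrative sketch of that general technique, not Quatrro's actual BI Tool; the function name and figures are hypothetical.

```python
# Illustrative sketch of trend-based forecasting of the kind SMB BI tools
# offer for budgeting and predictive analysis. Not Quatrro's BI Tool;
# all names and numbers here are hypothetical.

def linear_forecast(values, periods_ahead=1):
    """Fit a least-squares line through the values and extrapolate forward."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return intercept + slope * (n - 1 + periods_ahead)

# Example: quarterly revenue with a steady upward trend.
revenue = [100.0, 110.0, 120.0, 130.0]
print(linear_forecast(revenue, periods_ahead=1))  # -> 140.0
```

Real BI products layer seasonality, benchmarking, and dashboarding on top, but the underlying forecast is often a fit-and-extrapolate step like this one.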



He declined to live tweet his upcoming wedding from the altar, but there is no doubt that Nick Hayes is the social media expert on Forrester’s S&R team. He has extensive knowledge of the security, privacy, archiving, and compliance challenges of social media, as well as the technical controls used to address them. He also specializes in the tools that monitor and analyze social data to improve oversight and mitigation tactics of myriad reputational, third-party, security, and operational risks. He is certainly aware of the reputational risk of staring at your cell phone when you’re supposed to say, “I do”, but maybe if you follow him (@nickhayes10), you might get lucky with a pic or two -- and some good risk thoughts to boot.

Nick advises clients on a range of governance, risk, and compliance (GRC) topics, including corporate culture, training and awareness, and corporate social responsibility. He presents at leading industry and technology conferences, and he works with organizations of all sizes across all major industries.



Wednesday, 15 July 2015 00:00

The New Needs of Digital Business

Digital business requires change across a very wide range of areas. There is an increasing use of storage, vastly expanded networking requirements, and a rise in the virtualization of all equipment. Digital systems deployed on the network can be replicated, modeled, and situated anywhere, so we have seen virtual networks, virtual servers, virtual mobile solutions, and virtual workstations of all types. Virtualization creates a need for new management techniques that control, replicate, and abandon virtual components on an automatic basis and manage their various interactions. Information technology is moving outside the firm to the public cloud, either directly or connected through a hybrid cloud mechanism. All aspects of IT are becoming increasingly connected to all the artifacts and processes of the firm.

The frameworks used in enterprise architecture (EA) are also continuing to evolve and include elements such as big data, the cloud, mobile, and the other familiar elements of the changing environment. But what has not evolved so swiftly is the ability to rapidly change the models themselves and what they include as the cycles of technology change continue to accelerate. Continued development of digital business creates a space of massively interconnected data and processing, which must evolve into a more effectively governed system.



Given the complexity of managing IT environments these days, it’s now only a matter of time before machine learning is routinely applied to manage IT operations. One of the first companies to provide such a capability is SIOS Technology, which today announced the general availability of SIOS iQ software for VMware environments that applies analytics based on machine learning algorithms to both IT infrastructure and application software.

SIOS Technology COO Jerry Melnick says the machine learning software, available in both standard and free editions, first automatically discovers what should be defined as normal within any IT environment, and then over time learns which deviations from normal will result in a particular performance threshold being broken or a potential vulnerability being created.

Melnick says SIOS Technology decided to focus initially on the VMware environment because of the size of the installed base, but the technology will soon be more broadly applied. At its core is an implementation of a Postgres database running machine learning software that IT organizations download onto a VMware virtual machine. Via a SIOS PERC Dashboard, SIOS iQ then recommends the best solution to any particular issue it discovers.
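The learn-a-baseline-then-flag-deviations approach described above is a standard anomaly detection pattern. The following is a minimal sketch of that general technique, not SIOS iQ's actual algorithm; the function names and sample metrics are hypothetical.

```python
# Minimal sketch of baseline anomaly detection: learn what "normal" looks
# like from historical metrics, then flag deviations. Not SIOS iQ's
# implementation; all names and values here are hypothetical.
import statistics

def learn_baseline(samples):
    """Summarize historical metric samples as mean and standard deviation."""
    return statistics.mean(samples), statistics.stdev(samples)

def find_anomalies(samples, mean, stdev, threshold=3.0):
    """Return indices of samples more than `threshold` sigmas from the mean."""
    return [i for i, x in enumerate(samples)
            if stdev > 0 and abs(x - mean) / stdev > threshold]

# Example: CPU utilization (%) sampled over time.
history = [22, 25, 24, 23, 26, 24, 25, 23, 24, 25]
mean, stdev = learn_baseline(history)
new_samples = [24, 25, 91, 23]   # the 91% spike deviates from the baseline
print(find_anomalies(new_samples, mean, stdev))  # -> [2]
```

Production tools replace the static threshold with models that adapt over time and correlate deviations across infrastructure and application metrics, but the core loop is the same: learn normal, then score departures from it.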



Tuesday, 14 July 2015 00:00

U.S. Winter Storm Losses Mount

As my kids head off for their snowy-themed day at camp, the statistic that jumps off the page in the 2015 Half-Year Natural Catastrophe Review jointly presented by Munich Re and the Insurance Information Institute (I.I.I.) is the record $2.9 billion (and counting) in aggregate insured losses caused by the second winter of brutal cold across the Northeastern United States.

As Munich Re illustrates in the following slide, a total of 11 winter storm and cold wave events resulted in 80 fatalities and caused an estimated $3.8 billion in overall economic losses in the period from January 2015 to the end of winter:



A new study from the Ponemon Institute confirms that most healthcare organizations have been the victims of cyber attacks, placing sensitive patient data such as Social Security numbers and insurance information in the hands of identity thieves and organized criminals. With more and more healthcare organizations turning to managed service providers (MSPs) and cloud-based file sharing to store and administer their substantial number of patient records, healthcare organizations’ third-party vendors are increasingly held responsible for complying with industry standards for data protection.

The Fifth Annual Benchmark Study on Privacy & Security of Healthcare Data investigated data breaches among 90 healthcare organizations and 88 of their business associates. Their findings show a shocking increase in cyberattacks and identity theft across the healthcare industry.



The majority of IT decision makers in large and midsize U.S. companies want to outsource their public cloud management to managed service providers, with 70 percent preferring to deal with a single vendor to manage their entire IT infrastructure, according to a new report.

Digital Fortress, a managed cloud and colocation provider with data centers in Seattle, surveyed 100 IT decision makers online in June. The company found that 65 percent of companies plan to partially outsource management of the public cloud to a third party.



U.S. Office of Personnel Management (OPM) Director Katherine Archuleta resigned last week after OPM officials discovered a data breach in April.

And as a result, OPM tops this week's list of IT security news makers, followed by the Army National Guard, Service Systems Associates (SSA) and "Gunpoder" malware.

What can managed service providers (MSPs) and their customers learn from these IT security news makers? Check out this week's list of IT security stories to watch to find out:



Are you a risky partner? According to a recent Skyhigh Networks survey, nearly 8 percent of cloud partners are given access to company data that is considered high-risk. For MSPs, it’s vital that your clients see your cloud-based file sharing services as a safe move for their company.

To work effectively with clients, you must show yourself to be a low-risk partner, one that works hard to secure its cloud sharing services. The average company works with 1,500 business partners via the cloud. By first proving yourself a trusted partner, you can then start working to protect your clients against the other 1,499.



On January 28, 1986, nearly 30 years ago, the space shuttle Challenger broke apart 73 seconds into its flight, leading to the tragic deaths of its seven crew members.[1] As the spacecraft disintegrated over the Atlantic Ocean, the paradigm of risk management shifted from reactive to proactive. Taxonomies, frameworks, methodologies and tools developed rapidly to meet this need to manage risk proactively. And while, nearly 30 years later, we are more confident through the evolution of risk management that has taken place to answer the reactive question, “Are we riskier today than we were yesterday?” we face the stark realization that we are not truly able to answer an even more important question: “Will we be riskier tomorrow than we are today?”

Realizing a collective vision to have informative dashboards that look forward, providing confidence in assessments of how risky things are that lie ahead, is the work of the current generation. That makes today an exciting time for risk management. Great progress has been made, but as we reflect today, we know so much more can and must be done.

At this point, we thought we would take a pause and look back 30 years on how risk management has evolved and some of the lessons we can draw from the past.