Industry Hot News

IT disaster recovery, cloud computing and information security news

Ipswitch has announced the findings of its 2015 Wearable Technology Survey which polled IT professionals from businesses and organizations within the United Kingdom. The survey reveals concerns about wearable technology in the workplace.

Almost half (49 percent) of IT professionals are running networks that have smart watches connecting to them via Wi-Fi. Forty-three percent have fitness bands connecting, almost a fifth (17 percent) have health monitoring devices, and 12 percent have recording and photography gear.

Only seven percent of all respondents say that their company provides wearable technology to its own workers. This is despite a quarter (25 percent) of IT professionals saying in a similar survey in October 2014 that they expected to introduce wearable technology within the next year.

The top concerns for IT professionals relating to high adoption of wearable technology in the workplace were:

1. Security breaches (71 percent)
2. More work to support more devices (48 percent)
3. Decreased network bandwidth (32 percent).

However, when asked whether they had IT policies in place to manage the impact of wearable technology, over two-thirds (69 percent) said they did not, and only around one-fifth (21 percent) had such a policy.

Read the full survey report here.

Continuity Central is pleased to announce that the winner of the Business Continuity Paper of the Year competition has been judged to be Ian Ross, FBCI.

Ian’s paper was entitled ‘A systematic approach to managing a crisis: the value that technology can bring to the crisis management environment’ and can be read here.

The other shortlisted papers can be read here.

Judging was carried out by a panel of FBCIs who considered three main criteria:

1) Did the paper offer anything new to the business continuity body of knowledge?
2) Did the paper offer practical and useful assistance to business continuity professionals?
3) Would you consider the paper as ‘advanced level’ business continuity information?

Continuity Central has now launched its next Business Continuity Paper of the Year competition. The aim is to discover the best new business continuity articles and papers and a £500 or $800 prize will be presented to the winner.

Authors of any status, whether business continuity professionals, academics, students, or journalists, are invited to submit articles and papers written since 1st January 2015.

Entries must meet the following criteria:

  • They must have been written during 2015;
  • Copyright must be owned by the person submitting the entry and, in submitting the article or paper, the author gives permission for its publication;
  • Entries can be between 800 and 5,000 words long;
  • The subject matter of an entry can relate to any of the following topics: business continuity, disaster recovery, resiliency, crisis management, enterprise risk management, or technology continuity, resilience and availability;
  • Multiple entries from individual authors will be accepted;
  • Entries must be written in English;
  • The closing date for entries is 31st January 2016.

To submit an entry or request further information, email editor@continuitycentral.com. Entries should be emailed as an attachment in any word processing format or as an unlocked PDF. PowerPoint files will not be accepted.

When it comes to securing businesses against data loss, key considerations may include reducing human error and preventing hacking intrusions into servers and databases. But one growing problem for firms both large and small may be the risk posed by distributed denial of service (DDoS) attacks.

This type of cybercrime involves criminals flooding a server with data requests in order to render it inaccessible to genuine users. It’s typically thought of as a way for hackers to knock a website offline or disrupt a company’s operations, but new research has found the collateral damage of these incidents could be far more wide-ranging.

As well as leading to long periods of downtime and high recovery costs, a study by Kaspersky revealed that more than a quarter of DDoS attacks (26 per cent) now also result in the loss of sensitive data.

The problem is particularly prevalent for smaller firms: 31 per cent of small and medium-sized businesses (SMBs) reported data loss in the aftermath of DDoS attacks, compared with 22 per cent of larger enterprises.

Evgeny Vigovsky, head of Kaspersky DDoS Protection, commented: “Businesses have to re-evaluate their perception of a DDoS attack. The report clearly shows that the damage scope from such attacks goes far beyond the temporary downtime of a corporate website.”

However, a large number of companies are still overlooking the potential risks of these incidents, with a common sentiment being that a mitigation strategy will be too costly and difficult to implement.

SMBs in particular have limited resources to devote to the problem, and as DDoS is an umbrella term that covers several different attack technologies, methods to avert them can be hard to understand. As a result, only around half of SMBs think investing in prevention solutions is worth the effort.

However, with SMBs typically paying upwards of $50,000 (£32,600) in recovery bills, and almost one in ten attacks causing up to a week of downtime in addition to potential data loss issues, the consequences of not preparing can be severe.
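
For readers who want a concrete sense of the mechanism described above, the sketch below shows a token-bucket rate limiter, one of the basic per-client building blocks that DDoS mitigation services apply at far larger scale. It is purely illustrative and not drawn from the Kaspersky study; the class name and thresholds are arbitrary examples.

```python
# Illustrative only: a token-bucket rate limiter, one of the basic building
# blocks that DDoS mitigation layers apply per client at far larger scale.
# The rate and burst values below are arbitrary examples, not recommendations.
import time
from collections import defaultdict

class TokenBucket:
    def __init__(self, rate_per_sec=10, burst=20):
        self.rate = rate_per_sec                      # tokens refilled per second
        self.burst = burst                            # maximum bucket size
        self.tokens = defaultdict(lambda: burst)      # per-client token balance
        self.last_seen = defaultdict(time.monotonic)  # per-client last request time

    def allow(self, client_ip):
        """Return True if this client's request is within its budget."""
        now = time.monotonic()
        elapsed = now - self.last_seen[client_ip]
        self.last_seen[client_ip] = now
        # Refill the bucket in proportion to elapsed time, capped at burst.
        self.tokens[client_ip] = min(self.burst,
                                     self.tokens[client_ip] + elapsed * self.rate)
        if self.tokens[client_ip] >= 1:
            self.tokens[client_ip] -= 1
            return True
        return False  # over budget: drop or challenge the request

limiter = TokenBucket()
print(limiter.allow("203.0.113.7"))  # True for a well-behaved client
```

Real-world mitigation combines many such controls with upstream traffic scrubbing and analysis, which is why dedicated services remain the practical option for most SMBs.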

Complex data recovery requires expertise. Speak to the data recovery industry pioneers at Kroll Ontrack for free advice to investigate options to recover from any data loss type, system or cause.

From: http://www.krollontrack.co.uk/company/press-room/data-recovery-news/data-loss-a-growing-side-effect-of-ddos-attacks,-study-says612.aspx

Hardly a day goes by without IT professionals hearing some new horror story about how digital espionage is wreaking havoc throughout the world, whether it is the hacker threat that grounded Polish Airlines or the cyber security issues boiling between the US and China. IT security is becoming a top concern across company boardrooms and parliaments alike. So, where does all this leave managed service providers (MSPs) and their cloud-based file sharing services? Undoubtedly, all the fear mongering is going to present a challenge in securing more prospect signups. Yet, provided you play your cards right, this just might be the biggest opportunity yet!



(MCT) - Lucy Jones, Southern California’s “earthquake lady” and a driving force behind Los Angeles Mayor Garcetti’s ambitious seismic safety plan, was awarded the Samuel J. Heyman Service to America Medal in Citizen Services, officials announced Wednesday.

Often referred to as the "Oscars" of government service, the “Sammies” recognize federal workers who have made a notable impact in the United States and around the world. Judges considered nearly 500 nominations and selected eight winners out of 30 finalists.

Jones, who joined the U.S. Geological Survey in 1983, is recognized across Southern California for her research and ability to explain earthquakes to the general public.



(MCT) - Recent heavy rains and floods across South Carolina that broke multiple dams and destroyed hundreds — if not thousands — of homes have turned a spotlight on the state’s dam safety program.

South Carolina has for years had one of the nation’s weakest dam safety programs, consistently ranking near the bottom in federal and state government reports.

In 2013, the state spent less than $200,000 on its dam safety program, employing a handful of people devoted specifically to inspecting and regulating the structures. That’s roughly the same amount the state spent on the program in 2010, when a national report rated South Carolina 45th nationally in financial resources committed to dam safety.

Lori Spragens, executive director of the national Association of State Dam Safety Officials, said resources for inspecting the state’s dams remain low in South Carolina. All told, South Carolina has 2,300 dams, most of them privately owned and made of earth.



(MCT) - York County Office of Emergency Management on Tuesday unveiled a new way for residents to contact it in an emergency, along with new software to better prepare its response.

The Text-to-911 system, first detailed in August 2014, and the Smart911 emergency profile database, which the county signed onto in August 2015, will both come online Wednesday.


In an emergency, residents will now be able to text 911 to reach a dispatcher, though the office stresses that voice calls are still preferred.



Additional data, animation features added to popular coastal product


NOAA's nowCOAST web portal, shown here with a prediction of surface water currents, has 60 layers of data the user can click on or turn off. (Credit: NOAA).

NOAA has upgraded nowCOAST, a GIS-based online map service providing more frequently updated ocean observations along with coastal and marine weather forecasts. The new version, which went live on September 21, also offers visual point-and-click access to 60 NOAA data products and services. Users can reach the site at nowcoast.noaa.gov.

“NOAA’s nowCOAST gives the public a one-stop-shop look at coastal conditions — real-time and forecast — before they do or plan anything on the water,” said Rear Admiral Gerd Glang, director of NOAA’s Office of Coast Survey. “Are you sailing? Look at the winds and currents. Are you a commercial shipper? Get your high seas marine weather forecast, on the same animated map where you can check the tides before you approach your port.”

The original version of nowCOAST, available since 2003, has provided the public with information on the latest observed and predicted coastal weather, marine weather, and oceanographic and river conditions. The updated map viewer allows users to animate observations for the past four hours and forecasts for the next seven days.

The new version also adds significant data from NOAA’s National Ocean Service and National Weather Service, including watches, warnings and advisories for hazardous marine weather conditions, even far offshore. It also provides near-real-time lightning strike density data for land and over water, and hydrologic conditions and predictions from ocean forecast models.

“The new time-enabled map services go beyond traditional navigation uses,” said Luis Cano, director of the NWS dissemination office. “For instance, during coastal storms, emergency managers are now able to overlay National Weather Service watches, warnings, and forecast products on top of critical infrastructure and evacuation maps, for better response.”

NowCOAST is an ArcGIS-based web mapping application developed by the Office of Coast Survey’s Coast Survey Development Laboratory, with technical assistance and IT support from National Weather Service’s National Centers for Environmental Prediction.
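
For developers curious how such a map service is typically consumed, the snippet below is a minimal sketch of pulling a rendered image from an ArcGIS REST map service in Python. The service URL is a hypothetical placeholder rather than the actual nowCOAST endpoint; the export operation and its bbox/size/format parameters follow the standard ArcGIS Server pattern.

```python
# A minimal sketch of requesting a rendered map image from an ArcGIS REST map
# service. The SERVICE_URL below is a hypothetical placeholder, not the actual
# nowCOAST endpoint; the export operation and its parameters are the standard
# ArcGIS Server MapServer pattern.
import requests

SERVICE_URL = "https://example.com/arcgis/rest/services/coastal_obs/MapServer"  # hypothetical

params = {
    "bbox": "-74.5,40.2,-73.5,41.0",  # lon/lat box roughly covering New York Harbor
    "bboxSR": "4326",                 # WGS84 coordinates for the bbox
    "size": "800,600",                # width,height of the returned image in pixels
    "format": "png",
    "transparent": "true",
    "f": "image",                     # ask the server for the rendered image itself
}

resp = requests.get(f"{SERVICE_URL}/export", params=params, timeout=30)
resp.raise_for_status()

with open("coastal_conditions.png", "wb") as fh:
    fh.write(resp.content)            # save the rendered map for display or overlay
```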

NOAA’s mission is to understand and predict changes in the Earth's environment, from the depths of the ocean to the surface of the sun, and to conserve and manage our coastal and marine resources. Join us on Facebook, Twitter, Instagram and our other social media channels.



Is resilience really the next big step forward for the business continuity profession? Betty A. Kildow, FBCI, CBCP, attempts to separate the hype from the reality when it comes to this controversial subject.

There are great time demands on business continuity professionals who are developing and managing programs, often to the extent that we seldom have time to stop and consider the bigger picture of where our profession stands, where we are going, and the relevancy of new developments and trends.  A case in point is the increasing interest in resilience and its relationship to business continuity management programs. 

This article is a combination of facts, opinions, and musings on the condition of BCM and also resilience, written from one person's perspective with the hope that it will initiate thought, reflection, and discussion of these two related topics. 

Things change, and generally speaking, that is a good thing. Quoting Bertrand Russell, "In all affairs it's a healthy thing now and then to hang a question mark on the things you have long taken for granted."  W. Edwards Deming made an even stronger call for change, "It is not necessary to change. Survival is not mandatory."  Over the thirty-year history of business continuity (previously business recovery) we have seen significant changes and improvements as our profession has evolved, as we have risen to the challenges of increased requirements and a growing list of risks and threats.



In a blog earlier this year, I sounded an alarm about the dangers of investing in companies with no internal audit functions. Ultimately, the goal was to raise awareness of the risks that accompany the absence of internal audit in publicly traded companies.

That effort took an important step forward in September when The Institute of Internal Auditors formally recommended to the U.S. Securities and Exchange Commission that all publicly traded companies be required to have an internal audit function.

There have been a number of high-profile financial and corporate governance scandals of late that should hammer home the absolute necessity of good corporate governance, and it should go without saying that internal audit adds value to that process by providing effective oversight of the control environment.



Thursday, 08 October 2015 00:00

Local Colleges Always Prepared for the Worst

(MCT) - It happens within seconds.

One moment the professor is lecturing about any given topic and the next moment an alert on a cellphone reads there is an active shooter on campus.

It’s a scenario that many students on college campuses keep in the back of their minds.

It’s also a situation that became real life for students at Umpqua Community College in Roseburg, Oregon, when 26-year-old Chris Harper-Mercer killed nine people and injured seven more before killing himself last Thursday.

It’s situations like these that Monroe County’s higher education institutions prepare for, review frequently, and maintain rules to protect their students against.

To start, carrying concealed weapons is not allowed on either campus, according to the spokeswomen.



Thursday, 08 October 2015 00:00

Flood's Aftershocks Rock Columbia, S.C.

(MCT) - The aftershocks from the weekend’s historic rain storm continued to shake the Columbia area Wednesday with more dams breaking, more flooding fears and more people dying.

Two employees of a Kentucky-based company who were repairing railroad tracks damaged in the storm died early Wednesday after their vehicle became submerged in flood waters in lower Richland County.

The bodies of Robert Bradford Vance, 58, of Lexington, Ky., and Ricky Allen McDonald, 53, of Chesapeake, Ohio, were pulled from the vehicle in Cedar Creek, Richland County Coroner Gary Watts said.

Vance and McDonald were traveling with three co-workers from a job site about 3 a.m. Wednesday when their vehicle drove through a barricade, near the 2100 block of Congaree Road, and fell into the creek, which had washed out the road, Watts said.



Wednesday, 07 October 2015 00:00

How Soon to Full Cloud Dominance?

The cloud is putting up some big deployment numbers in recent months, leading many analysts to ponder not whether it will become the dominant form of IT, but when.

According to IDC’s most recent forecast, the total cloud infrastructure spend will top $32.6 billion this year, a 24.1 percent increase over 2014. This includes the server, storage and Ethernet switch markets without even double counting server/storage deployments as additions to both servers and storage. In total, this accounts for about a third of all IT infrastructure spending, up from about 28 percent last year. Even more telling, though, is the 1.6 percent drop in non-cloud spending, which indicates that even money going into legacy data centers is being earmarked for private and hybrid clouds.
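
A quick back-of-envelope check shows what those percentages imply; the figures below are rough arithmetic on the quoted numbers, not additional IDC data.

```python
# Back-of-envelope check on the IDC figures quoted above (rounded; illustrative).
cloud_2015 = 32.6e9          # projected 2015 cloud infrastructure spend, USD
growth = 0.241               # 24.1 percent increase over 2014

cloud_2014 = cloud_2015 / (1 + growth)
print(f"Implied 2014 cloud spend: ${cloud_2014/1e9:.1f}B")    # ~ $26.3B

# "About a third" of total IT infrastructure spending implies a rough total:
total_2015 = cloud_2015 * 3
print(f"Implied total IT infrastructure spend: ${total_2015/1e9:.0f}B")  # ~ $98B
```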

Also interesting is that this is happening at a time when there are still some serious roadblocks when it comes to enterprise cloud adoption. As Bracket Computing’s Navneet Singh noted recently, security, control and performance consistency remain the largest drawbacks. Practically every day, however, these issues are becoming less intractable as hybrid infrastructure, unified management stacks, software-defined networking and a host of other advancements make it easier to run multiple cloud deployments as a single data ecosystem.



Newly minted Vice President and Principal Analyst, Rick Holland, is one of the most senior analysts on our research team. But for those of you who haven’t had the opportunity to get to know him, Rick started his career as an intelligence analyst in the U.S. Army, and he went on to hold a variety of security engineer, administrator, and strategy positions outside of the military before arriving at Forrester. His research focuses on incident response, threat intelligence, vulnerability management, email and web content security, and virtualization security. Rick regularly speaks at security events including the RSA conference and SANS summits and is frequently quoted in the media. He also guest lectures at his alma mater, the University of Texas at Dallas.

Rick holds a B.S. in business administration with an MIS concentration (cum laude) from the University of Texas at Dallas. Rick is a Certified Information Systems Security Professional (CISSP), a Certified Information Systems Auditor (CISA), and a GIAC Certified Incident Handler (GCIH).

Check out this week’s interview with Rick for his take on the threat intelligence market and on a few innovative companies in the spaces that he covers.



Wednesday, 07 October 2015 00:00

Trends in Travel Risk Management

Managing health, safety and security risks to workers on international travel and assignment is the subject of a new paper from the Federation of European Risk Management Associations (FERMA) and International SOS.

A survey led earlier this year by FERMA and International SOS found that travel risk management is on the agenda for 79 percent of the risk and insurance managers polled.

The document identifies the risk manager as a pivotal influencer in evaluating effective travel risk management solutions. As noted in the paper: "The risk manager's holistic perception of the medical, security and insurance aspects is critical to considering efficient solutions and practical responses to any situation an organization might face when sending workers abroad." 

The paper includes:  

  • A legal review of duty of care for organizations in Europe;
  • Best practices and practical experiences from leading risk practitioners;
  • A travel risk management toolbox that outlines health and travel security measures that organizations can implement to help reduce risks for their travellers and international assignees;
  • A review of the transposition of EU legislation in 15 Member States, which shows that while national laws vary, they always tend toward greater health, safety and security responsibilities for organizations towards their workers.

Read a copy of the paper here.

The complexity of today’s business environment threatens to overwhelm the compliance function in many organizations as they struggle to respond to questions from regulators, executive committees and Boards. Unfortunately, one common panacea for organizational complexity—technology—has not won an overwhelming number of supporters in the risk and compliance space. According to a recent survey Deloitte conducted with Compliance Week, only 32 percent of compliance executives were confident or very confident in their IT systems, a rate that has actually dropped from the 41 percent recorded when the survey was conducted in 2014. This may be why the majority say they primarily depend on desktop software and in-house tools such as spreadsheets to perform most compliance tasks. Reliance on these tools is one reason many compliance functions tend to spend the preponderance of their time gathering data rather than analyzing it.

One technology solution that has begun to have an impact in the compliance space is the governance, risk and compliance (GRC) tool set. While not perfect, these tools have improved enormously over the past five years and have the potential to automate such activities as data collection, control testing, issue management, workflow and reporting. As with any tool set, implementation of appropriate governance processes and procedures is critical to overall success.
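
To make the automation opportunity concrete, the sketch below models the kind of control-test record a GRC tool set tracks and rolls up into reporting. It is purely illustrative; the field names and workflow are hypothetical and not taken from any particular product.

```python
# Purely illustrative: a minimal record of the kind a GRC tool set automates
# (control testing feeding issue management and reporting). All field names
# are hypothetical, not taken from any particular product.
from dataclasses import dataclass, field
from datetime import date
from enum import Enum

class TestResult(Enum):
    PASSED = "passed"
    FAILED = "failed"
    NOT_TESTED = "not_tested"

@dataclass
class ControlTest:
    control_id: str                                    # e.g. an access or anti-bribery control
    owner: str
    tested_on: date
    result: TestResult
    issues: list[str] = field(default_factory=list)    # open findings to track

    def needs_remediation(self) -> bool:
        return self.result is TestResult.FAILED or bool(self.issues)

# A compliance dashboard then becomes a simple aggregation instead of a
# hand-maintained spreadsheet:
tests = [
    ControlTest("AC-01", "IT Security", date(2015, 9, 30), TestResult.PASSED),
    ControlTest("FCPA-03", "Legal", date(2015, 9, 30), TestResult.FAILED,
                ["Third-party due diligence file incomplete"]),
]
open_items = [t.control_id for t in tests if t.needs_remediation()]
print(open_items)  # ['FCPA-03']
```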

Experience gathered while working with compliance professionals on numerous GRC initiatives has led to the identification of five critical success factors:



EATONTOWN, N.J. - Among the most devastating effects of Superstorm Sandy in New Jersey was the storm’s impact on sewage treatment facilities along the coast.

During and after the storm, sewage plants and pump stations were inundated by flood waters and without power for as long as three days, resulting in the discharge of some two billion gallons of untreated and partially treated sewage into New Jersey waterways (New York Daily News, 4/30/2013).

The environmental damage was unprecedented – and the financial impact was devastating. Total costs to repair and reconstruct the damaged sewage treatment facilities now top $100 million.

With the help of FEMA Public Assistance Grants, sewage treatment authorities throughout the state have acted to reduce the risk of a similar disaster through mitigation measures that include constructing flood walls, elevating sensitive equipment, and relocating vulnerable facilities out of the flood zone.

In southern Monmouth County, the Southern Monmouth Regional Sewerage Authority owns, operates and maintains 11 sewage pump stations in Belmar, Brielle, Lake Como, Manasquan, Sea Girt, Spring Lake, Spring Lake Heights and Wall Township. FEMA has obligated more than $5.3 million in federal funding for the Southern Monmouth Regional Sewerage Authority to date.

The majority of the Authority’s sewage pump stations were constructed and placed into operation in the 1970s. But in Sea Girt, the authority converted an existing facility constructed in the 1900s. By 2006, that facility had outlived its useful life and the Authority made plans to replace it. The Sea Girt pump station had been flooded in the past, and the likelihood was high that it would experience repeated flooding.

While the Authority considered relocating the facility, that idea was not feasible because of the cost, permitting restrictions and the lack of available land in the heavily residential community.

Instead, the project team comprising Authority officials and project engineers worked together to design a facility that could remain within the footprint of the old plant but that would be better equipped to function and survive during a major storm.

A mobile trailer houses the Sea Girt Pumping Station's most sensitive equipment.

The plan they decided upon called for a mobile trailer for the pumping station’s most sensitive equipment. The trailer can be moved out of harm’s way when flooding threatens.

The enclosure consists of two rooms, one sound-attenuated room for the emergency generator and another climate-controlled room for the electrical equipment, including controls, alarm systems, variable speed drives and various other components. Electrical and control connections between the enclosure and the pump station and its equipment are made with cables and plugs that can be opened to permit removal of the enclosure.

The trailer can be removed when emergency management officials notify the Authority of an impending storm.

When the trailer is removed, an expendable portable generator and transfer switch is put in its place, allowing the pump station to operate even when utility power is lost. A secondary, sacrificial electrical and control system, permanently mounted on the site, powers the pumps and other equipment on utility or generator power.

Once the storm subsides, the mobile trailer can be moved back into place and put back on line. The mobile trailer plan minimizes any damage to the station’s electrical equipment and significantly reduces downtime for the station.

The station is then able to return to normal operation within hours of the passing storm, rather than days, weeks, or months, thus reducing the public health risk that can result when untreated sewage is discharged into waterways.

A portable trailer houses an emergency generator at the Sea Girt pump station before Superstorm Sandy struck. R. Arias

The Sea Girt pump station is also in harmony with Governor Chris Christie’s goal to make New Jersey energy resilient, and it is considered a model Best Management Practice for sewage and water authorities: it enables continuous operation during adverse weather events, eliminating or substantially reducing the potential for an environmental disaster caused by the release of untreated sewage.

As a result of the steps the Authority took to mitigate risks at the facility, the Sea Girt pump station withstood the assault of Superstorm Sandy, a 100-year storm.

Today, the Authority is implementing the mobile trailer plan at its Pitney pump station and will relocate its Spring Lake station outside of the 100-year flood zone, preventing a repeat of the environmental damage and expense that occurred as a result of Sandy.

See related video: http://www.fema.gov/media-library/assets/videos/86134

FEMA's mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards.

Follow FEMA online at www.twitter.com/FEMASandy, www.twitter.com/fema, www.facebook.com/FEMASandy, www.facebook.com/fema, www.fema.gov/blog, and www.youtube.com/fema. Also, follow Administrator Craig Fugate's activities at www.twitter.com/craigatfema.

The social media links provided are for reference only. FEMA does not endorse any non-government websites, companies or applications.



If IT professionals are to maintain their relevance as influential, strategic leaders in an organization, they need to do a lot more than just ensure that the lights are kept on—they need to be drivers of innovation. And the best way to accomplish that just might be to crowdsource it.

That’s the message being advocated by James Gardner, CTO of Mindjet, a mind mapping and innovation management software provider in San Francisco. I recently had the opportunity to speak with Gardner in his capacity as the technology leader of Spigit, which Mindjet acquired in 2013 to become its innovation management software arm. Gardner opened the conversation by explaining Spigit’s role as a platform for crowdsourcing idea generation to drive innovation:



Wednesday, 07 October 2015 00:00

MSPs: How to Educate Non-IT Staff

In many instances, the IT world seems out of reach to people beyond its realm. To most people, it is a place with its own lingua franca, rules, and perhaps even extremely elaborate secret handshakes. Now, while the difference between the techie and the sales rep has provided us with many a laugh over the past few years, we are actually beginning to see damage resulting from this disconnect. Cloud-based file sharing services such as yours are perceived as part of the tech macrocosm, which automatically draws an invisible barrier between you and your non-tech clients – an unnecessary barrier that you must overcome before their patronage can be won.

For instance, a recent report by BH Consulting reveals that most non-IT professionals simply have no idea about data breaches and other information security threats. This even encompasses some of the most widely publicized attacks, such as the one against Sony and the Heartbleed vulnerability.



A corn crop in Arkansas is stunted and sparse due to drought conditions. (Credit: USDA NRCS Photo Gallery, Tim McCabe).

NOAA’s Climate Program Office (CPO) today announced it has awarded $48 million for 53 new projects. Research will be conducted by NOAA laboratories and operational centers, universities, and other agency and research partners to advance the understanding, modeling, and prediction of Earth’s climate system and to improve decision making.

The results of research funded by these grants are expected to have impacts far beyond individual projects. Some of the anticipated results include more accurate weather and climate prediction, early warning of drought hazards, more robust decision-support services, enhanced community and drought preparedness, and improved ability to respond and adapt to public health impacts.

States with institutions receiving NOAA CPO funding from the FY2015 Competition.

The funds will be distributed over the life of the projects, many of which span one to five years. All awards were selected through an open, highly competitive process.

"Every day, communities and businesses in the U.S. and around world are grappling with environmental challenges due to changing climate conditions and extreme events," said Wayne Higgins, director of the Climate Program Office. "People want timely and relevant scientific information about where and why climate variability and change occur and what impacts that has on human and natural systems. CPO's competitive grants play a vital role in advancing understanding of Earth's climate system and in transitioning our data, tools, information, and operations to applications the public can use to improve decision making.”

Great Lakes Regional Sciences and Assessments (A RISA program) hosts workshop in St. Paul, Minnesota connecting local governments with climate adaptation science. (Credit: With permission from Daniel Brown).

The projects will support these priorities:

  • Provide high-quality, long-term global observations, climate information and products: $5.1 million for projects to produce global and regional indices to help monitor climate, weather, and sea ice trends, which provide information to forecasters, researchers, and decision makers in communities across the country.
  • Provide leadership and support for research, assessments, and climate services to key sectors and regions: $24.4 million — including $19.5 million for Regional Integrated Sciences & Assessment Programs from Hawaii to New York — to improve the ability of local communities to prepare for and adapt to climate change.
  • Improve critical forecasts and bolster earth system models: $10.2 million to improve predictions and projections on a range of time scales from weeks to seasons, to decades, and centuries in the future.
  • Improve prediction of drought and other extreme events: $8.4 million to improve earth system models and predictions through the North American Multi-Model Ensemble System (NMME), a state-of-the-art seasonal prediction system, and help fund the creation of a new task force and improved software infrastructure for NOAA weather and climate models.

A farmer in the Midwest struggles with drought conditions. (Credit: Climate.gov and U.S. Climate Resilience Toolkit photo)

CPO manages competitive research programs that fund climate science, assessments, decision-support research, modeling improvements, and capacity-building activities. While each program has its own focus, together they demonstrate NOAA’s commitment to advancing integrated climate research and enhancing society’s ability to plan and respond to climate variability and climate change. CPO’s network of partners, specialists, and principal investigators will broadly integrate research findings from these projects to help build resilience in the face of climate challenges. 

A full list of awards, as well as individual announcements for each program, is available online.

NOAA’s mission is to understand and predict changes in the Earth's environment, from the depths of the ocean to the surface of the sun, and to conserve and manage our coastal and marine resources. Join us on Facebook, Twitter, Instagram and our other social media channels.



(TNS) -- Hurricane Joaquin will give federal officials a chance to test a new system designed to provide real-time information about water conditions on Long Island and beyond during a storm, allowing emergency agencies to respond more quickly.

The U.S. Geological Survey's Surge, Wave and Tide Hydrodynamic Network reaches from Virginia to Maine, with dozens of sites on Long Island.

While Joaquin is expected to veer east of Long Island, the agency plans a scaled-back deployment of the system on Long Island, said Ron Busciolano, supervisory hydrologist with the USGS New York Water Science Center in Coram.



National Cybersecurity Awareness Month certainly started with a bang, and not in a good way. My inbox was blowing up on Friday afternoon with alerts about the Experian breach involving T-Mobile wireless customers, and before I could catch up on that news, the emails shifted direction to the Scottrade breach. Today, as I was searching for more information about the breaches, I saw an announcement that the American Bankers Association’s website was hacked.

Even in this breach heavy (and weary) world, that’s a lot of bad news all at once. In fact, this comment that Ryan Wilk, director, customer success, NuData Security, sent to me seemed to sum up the news of the past few days quite well:

Data breaches don’t occur in a vacuum. The repercussions are widespread and often have a ripple effect.



They last a lifetime and they never change. Fingerprints, irises and even gaits (as in walking) are immutable, if you discount the use of surgery. That is what makes them such reliable identifiers and the basis of different biometric security systems. Once the stuff of science fiction and spy films, fingerprint recognition is now integrated into smartphones (iPhones, for example). Users no longer have to remember or reset those ID/PIN combinations. Yet hackers recently stole a file with 5.6 million fingerprints of US government employees. And of course, unlike ID/PIN combinations, those fingerprints cannot be reset. Now what?



A proper understanding of risk tolerance is one of the key factors that can help the risk function demonstrate that it is much more than a cost centre, and truly adds value to the organization. This is according to John Merkovsky, Head of Willis Risk & Analytics, writing in the seventh edition of Resilience, the leadership journal from Willis Group Holdings. 

For risk professionals, there is no more important consideration than understanding the amount of risk an organization a) is able to take, b) is willing to take and c) desires to take, according to Merkovsky in his article entitled ‘Risk Tolerance: The Risk Manager's Compass’.

The paper explains that a proper understanding of risk tolerance can help organizations in a number of ways. It can, for example, afford a deeper understanding of whether or not the organization is adopting the desired level of financial protection. Additionally it can help the risk function understand whether risk transfer is supporting the organization's overall strategic goals. "To make better decisions about insurance, an organization's risk tolerance needs to be reflected," said Merkovsky.

Merkovsky goes on to say that despite the benefits, risk tolerance is rarely engrained in risk management processes and structure. This is because the concept is often difficult to apply in practice and the nomenclature is not used consistently across the industry. Moreover, executives within the same organization often have very different views on the level of risk the organization should be willing to take.

Merkovsky commented: "This unsettled environment presents a terrific opportunity for a truly strategic risk manager to lead. But first, a risk manager needs to be able to demonstrate the value accretion that a well-defined view of risk tolerance can add to decision making."

He added: "Many organizations are looking to advance their thinking about their approach to risk tolerance yet they lack the consistent nomenclature, tools and focus to do so. Risk managers are well positioned to provide leadership here. Their experience in thinking across a broad range of risk topics and doing so in both financial and organization terms is unique in most organisations.

"And, if leading an organizational initiative on risk tolerance is not for every risk manager, it is still a great opportunity to ensure that their own insurance and risk management activities are built with a clear alignment of organizational goals. In this way it will be clear to senior management and other risk stakeholders that the risk management function is much more than a cost centre, and truly adds value to the organization."

Read the complete article.

(MCT) - Even those who went through Hurricane Hugo a quarter-century ago said they had never seen anything like this, the deadly torrents that crumbled roads, submerged houses and cars and killed at least 12 people — 10 in South Carolina and two in North Carolina.

“They’re saying it’s a once-in-1,000-year rainstorm, and I’m inclined to believe it,” said Sean Brennan, a real estate broker who had just checked on a colleague’s house in South Carolina’s capital, Columbia.

“It looked like a river ran through it,” Brennan said.

Even though the house was built 4 feet above ground, the water came up nearly 2 feet into the garage, he said. The backyard was a lake.



In a quest for improved efficiency, higher performance and maximized storage utility, organizations operating in increasingly demanding IT environments continue to deploy expensive proprietary products or ineffective heterogeneous hardware/software solutions to gain the IOPS and/or capacity required for their particular compute needs. Companies can better maximize their storage investment while outperforming alternative solutions by following these five tips:

1. Ditch the “legacy” equipment – Too often companies look to save money by repurposing their existing data center hardware using software-defined solutions that are then supposed to provide the performance and capacity lacking in their current infrastructure. The problem is that while software solutions aim to optimize how data flows through any particular hardware architecture, the results will fall short of what optimized hardware can deliver: the highest speeds and peak performance, without the reflections, throughput bottlenecks and signal loss inherent in underperforming legacy solutions.

2.	Avoid vendor lock-in – Purpose-built solutions in which both hardware and software are designed by a single company to work in perfect harmony can be an ideal solution for organizations needing the performance oftentimes lacking in heterogeneous environments.  However, the costs of such solutions are often prohibitive, with expenses incurred from purchasing the hardware and the ongoing software licenses, maintenance contracts and upgrade expenses.  Implementing an Open Storage Platform allows customers to integrate hardware in any configuration, with the software solution of their choice to maximize flexibility while minimizing costs.

3.	Adopt a hardware storage platform that complements the software solution of choice – Whether using open-source software or proprietary software-defined storage solutions, a company’s use case should determine which protocol and “flavor” of storage they implement.  Whether they need DAS, NAS, SAN or other protocols, a mechanically flawless hardware architecture that overcomes software incompatibilities is a necessity to satisfy the ruthless IO requirements of cloud storage, big data analytics, HPC, enterprises and remote sites.

4. Mitigate the “either or” dilemma of choosing between performance and capacity – By implementing a hardware solution that increases the storage density of both SSD and SAS/SATA storage solutions, companies can gain the benefits of both traditional Tier 1 and Tier 2 storage in a flexible, customizable and fully scalable single storage solution. Through the combination of storage and compute resources in a single storage solution, organizations can cut data center space requirements while increasing performance at a lower TCO compared to disparate systems.

5. Implement hardware that utilizes top-of-the-line components from audited suppliers – State-of-the-art software solutions need state-of-the-art storage servers built to the highest specifications for speed, performance and reliability. By using top-of-the-line components from audited suppliers rather than cheaper mass-produced parts, companies can eliminate crosstalk, packet loss and power jitter, assuring themselves of maximum throughput and reliability not available in standard, off-the-shelf hardware products.

Choosing a storage solution is a business-critical decision for many organizations faced with the capacity and performance needs of today’s data-hungry environments. Storage incumbents continue to offer the same hardware configurations regardless of improvements in available media or software, claiming that their equipment is capable of handling the changes. The reality is that only by implementing a high-performance storage server system designed to maximize the utility of software solutions can organizations truly meet the demanding storage needs of their particular industry.

Designed without the inherent physical bottlenecks or software incompatibilities of other storage products, SavageStor is an all-in-hardware (server, networking and storage) solution that satisfies the ruthless capacity and IO requirements of cloud storage, big data analytics, HPC, enterprise and ROBO environments.  
Monday, 05 October 2015 00:00

Deadly Places To Place Portable Generators

By Lisa Kaplan Gordon

Portable generators are a godsend when a storm kills your power or your RV needs some juice to keep food cold.
But portable generators, if not operated or placed correctly, can be a curse, too. Carbon monoxide, an odorless and invisible killer found in fuel emissions, can lull you into a permanent sleep. In fact, carbon monoxide exposure is the chief cause of death due to poisoning in the U.S., according to the New York State Health Department; carbon monoxide from portable generators caused 800 U.S. deaths from 1999 through 2012.

Carbon monoxide is insidious and can sneak into your home through windows cracked a smidge to accommodate extension cords, under entry doors, and into HVAC vents and pet doors.

I wish I had known that when a freak storm battered our Virginia home a few years ago, knocking out power for days. I purchased my first generator and dutifully placed it 10 feet from the house. What I didn’t do was close our garage door, where extension cords snaked into the house, or the side windows, which we opened to exploit a rare breeze.

The generator could – and probably did – send carbon monoxide fumes into the house; we were lucky that levels didn’t build and sicken or kill us.

Take-home lesson: Never run a portable generator in risky places, like the ones below.

Indoors: Don’t even think about running a portable generator inside, even if you throw open windows for increased ventilation, which will not protect you against deadly carbon monoxide accumulation. Inside includes garages, crawl spaces, attics, and basements. To be extra safe, install a battery-operated carbon monoxide detector/alarm or a plug-in detector with battery backup, which can alert you to rising levels of the deadly gas. Some home security systems include a carbon monoxide detector that will alert you and its monitoring station of rising gas levels.

Outdoors Near Openings: Even parts of the outdoors are unsafe places for portable generators. Unfortunately, just how far your generator should be from doors and windows is debatable. Some authorities say to place the generator 10 to 15 feet from the house. However, wind direction and the particulars of the house and generator all affect how much carbon monoxide could seep into your house. New research from the National Institute of Standards and Technology indicates that at least 25 feet from a house is a safer distance. Wherever you put the generator, make sure 3 or 4 feet of space surrounds it to ensure proper ventilation.

Wet Weather: It’s ironic: wet weather makes you need a portable generator, but you should never run portable generators in wet places, which could cause electrocution. The solution is placing the generator under an open-sided shelter or covering it with a GenTent canopy, which will keep it dry.

In/Near a Vehicle: You cannot operate a portable generator safely in an enclosed vehicle or even nearby. When tailgating, keep the generator as far away as possible, and direct exhaust away from you and your neighbors.

Portable generators are a great source of emergency power supply when and where you need it. But they can also be a health hazard if not properly operated or placed. Just be careful to place generators in open areas and away from your home to prevent carbon monoxide fumes from seeping into your house and causing harm or death.



As cyber risks evolve, enterprises have begun to focus on the insider threat by adding specialized capabilities for behavioral analytics on top of endpoint and network monitoring. In order for these tools to be most successful, there must be a fundamental understanding of the role an insider plays in a breach. Not every employee-caused breach is malicious, but such breaches are certainly numerous. In fact, according to Verizon’s most recent Data Breach Investigations Report, 90% of breaches have a human component, regardless of intent.

Insider threats are a rampant problem exemplified by several recent headline-making incidents: the indictment of six Chinese nationals on suspicion of stealing intellectual property worth millions from two U.S. technology firms; accusations from financial giant Morgan Stanley toward an employee believed to have stolen client information with the intent to sell it; and claims from wearable-maker Jawbone that its competitor Fitbit regularly courted its privileged employees, enticing several of them to switch companies and bring sensitive details on its products. The uncertainty around all of these cases raises a couple of important questions: how can intent be determined, and how can employee privacy be maintained while ensuring business security?
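
As a rough illustration of what behavioral analytics means in practice, the sketch below baselines one user's activity and flags a large deviation for review. The feature, threshold and numbers are arbitrary examples, not the logic of any particular product.

```python
# Purely illustrative: the core idea behind user behavioral analytics is to
# baseline each user's normal activity and flag large deviations for review.
# The feature and threshold here are arbitrary examples, not a product's logic.
import statistics

def anomaly_score(history, today):
    """Return how many standard deviations today's count sits above the user's baseline."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0  # avoid division by zero on flat baselines
    return (today - mean) / stdev

# Daily counts of sensitive-file downloads for one employee over two weeks.
baseline = [3, 5, 2, 4, 6, 3, 4, 5, 2, 3, 4, 5, 3, 4]
todays_downloads = 42

score = anomaly_score(baseline, todays_downloads)
if score > 3:  # flag anything more than three standard deviations above normal
    print(f"Flag for review: {score:.1f} sigma above this user's baseline")
```

A flag like this says nothing about intent on its own; it simply queues the activity for human investigation, which is where the privacy and intent questions above come into play.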



I frequently help Forrester clients come up with shortlists for incident response services selection. Navigating the vendor landscape can be overwhelming: every vendor with consulting services has moved, or is moving, into the space. This has been the case for many years; you are probably familiar with the saying: "when there is blood in the water." I take many incident response services briefings, and vendors don't do the best job of differentiating themselves; the messages are so indistinguishable you could just swap the logos on all the presentations.

Early next year, after the RSA Conference, I'm going to start a Forrester Wave on Incident Response services. Instead of waiting for that research to publish, I thought I'd share a few suggestions for differentiating IR providers.



Monday, 05 October 2015 00:00

A Cyber Security Confession

I’m going to hold my hands up right now and tell you that, as a resilience professional in 2015, I still feel like I know very little about cyber security, and it really concerns me.

I was recently listening to a very interesting discussion during an interview with Ken Simpson and the wonderfully insightful Lyndon Bird (a guy I’m constantly asked whether he’s my father because of our similar names) on the Beyond the Black Stump podcast series (I highly recommend a listen), where Lyndon, who is often described as one of the founding fathers of BC, touches on a point that I’ve been contemplating for a long time. In summary he says…

“Has business continuity gone through its lifecycle of conventional Business Continuity Management Systems into a wider arena called resilience and are our traditional skills ready for that?…Business continuity has a limitation in so far as where it goes to next…Cyber to some extent doesn’t fit our model.”



By: Sarah Leary

Online communication and social networks are changing the way that people communicate. Today, people are able to relay messages to those around them and those across the world nearly instantly. This instant communication is playing a critical role in emergency communication.


When the largest earthquake since 1989 hit Napa, California, and the greater San Francisco Bay Area in August 2014, neighbors and local agencies were quick to turn to social media to communicate updates and information about the damage and safety precautions. One of the platforms used was Nextdoor, a private social network that creates communication channels specific to individual neighborhoods.

Within minutes of the earthquake, residents used Nextdoor to send urgent alerts out to their communities, warning their neighbors to take cover in doorways, watch out for crumbling chimneys, and keep an eye out for scared and flighty pets. In the days following the quake, neighbors continued to use this new social network to share neighborhood-specific tips on clean-up efforts, offer shelter to neighbors in need, and report sightings of lost pets in the area.

Several Nextdoor agency partners, including both the City of Napa and the City of American Canyon, used social media to inform residents of damages, advice for contacting emergency personnel, school closures, and more. In many areas, social media was used to advise residents to keep an eye out for the sound or smell of leaking gas lines and provided road closure updates.

An incredible number of social media conversations in the greater San Francisco Bay Area that day were related to the earthquake – demonstrating that a connected community is indeed a stronger community. Neighbors connected with neighbors, passing along the latest information on power outages, road closures, and damage reports.

Similarly, during the flash flooding and historic rainfall in Houston, Texas, this May, the Houston Office of Emergency Management also turned to social media to send out important updates and urgent safety alerts to residents across the city.

“During times of emergency and natural disasters, it is often neighbors who are able to best help each other,” said Rick Flanagan, Emergency Management Coordinator at the Houston Office of Emergency Management. Social media “has played a vital role in not only helping our residents stay connected, but giving us an effective way to work directly with residents to make Houston a more resilient, prepared city.”

The ability to connect with the community online rapidly closed the communication gap that previously existed between residents and emergency services.

For towns that have experienced more than their fair share of natural disasters, like the City of Moore, Oklahoma, which has been plagued by tornadoes, social media platforms offer a way to connect communities and increase resiliency.

“The more connected you are to your neighbors, friends, and family, the more invested you are in your community. We have people that have gone through disaster and destruction and they have chosen to stay,” said Jayme Shelton, marketing specialist for the City of Moore. “I think Moore citizens choose to stay because of the people.”

Shelton noted, “We come together as a community during times of disaster, and it would be great if we kept that going throughout the year. We don’t have to have a disaster hit us to know your neighbors.” Social media platforms play a big part in connecting neighbors, community leaders, and emergency management resources.

In 2010, the Pew Research Center released a report stating that 28 percent of Americans do not know a single neighbor by name, and only 29 percent know one neighbor by name.

Social media has enhanced how public agencies and residents work together to build more resilient communities. Public safety agencies across the country are increasingly combining the power of social networks with the power of connected neighbors to help create safer, more resilient communities – whether the emergency is flooding in Texas, an earthquake in California, or a tornado in Oklahoma.

If neighbors are able to be better connected, they will be much more resilient and prepared for anything that comes their way.

Sarah Leary is the Co-Founder and Vice President of Marketing and Operations at Nextdoor, a free and private social network for neighborhoods.


Monday, 05 October 2015 00:00

BCM & DR: Mergers & Acquisitions (Part 1)

As many of you may know, I work in Program and Project Management, as well as Business Continuity and Disaster Recovery. I find the Program/Project Management aspects help build and manage the activities needed in BCM & DR and help communicate the need for, and secure buy-in from, executives. If you haven’t had any Project Management training, I suggest you attend a course (Note to self: New Post about Project Management). So it was interesting the other day when, during a program meeting, the topic of mergers and acquisitions with regard to BCM & DR came up – and not at my urging either.

If you work for a large corporate entity, you may have gone through a merger or acquisition – as either the purchaser or the one being acquired. If you work in an IT or DR/BCM role, then you’ve probably had some hair-pulling moments trying to figure out how new – or old – technologies work and how they need to work together in the event of a disaster. But it doesn’t have to be that difficult…at least if the newly acquired company will still operate as a ‘separate entity’.



DENTON, Texas – More than $5.6 million in federal funding was recently awarded to the state of Louisiana to fund wind damage and flood protection measures in Jefferson and Terrebonne parishes.

In Jefferson Parish, more than $2.8 million covers mitigation measures taken to protect government facilities such as fire headquarters and the police department from wind and debris damage. The measures include 571 impact-resistant screens and roll-down shutters.

In Terrebonne Parish, more than $2.8 million pays for the elevation of 23 storm-damaged properties to one foot above the 100-year flood level. This significantly reduces the effect of future flooding on those structures.

The funding for these projects originates from the Federal Emergency Management Agency’s (FEMA) Hazard Mitigation Assistance (HMA) grant programs. HMA, specifically the Pre-Disaster Mitigation program, provides funds for hazard mitigation planning and projects that reduce disaster losses and protect life and property from future damages. For more information on HMA, visit http://www.fema.gov/hazard-mitigation-assistance.

FEMA’s contribution represents a 72 to 75 percent federal cost share. FEMA awards funding for projects directly to the state of Louisiana; the state then disburses the grant to the eligible applicant.

Follow FEMA Region 6 on Twitter at https://twitter.com/femaregion6.

FEMA’s mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards.

Using business continuity management to protect against data breaches

Organizations that involve their business continuity management teams in data breach planning and response can reduce the likelihood of data breach and lessen the cost and impact of any breach that should occur. These findings were uncovered in the 2015 Cost of Data Breach Study: Impact of Business Continuity Management, sponsored by IBM and conducted by the Ponemon Institute.

Ponemon has been charting the cost of data breaches for the last 10 years and in 2014 began examining the correlation between the cost of data breaches and business continuity management’s involvement with cyber security teams in responding to them. This year, the study found that such involvement reduces breach costs by an average of US$14 per compromised record, from US$161 to US$147. Because data breaches can affect thousands of records, overall savings can be significant: BCM involvement can reduce the total cost of each data breach from US$3.8 million to US$3.5 million.
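
To put those per-record averages into concrete terms, here is a minimal sketch (plain Python, using only the figures quoted above; the breach size is a hypothetical illustration, not a number from the study):

```python
# Average cost per compromised record, from the 2015 Ponemon figures cited above
COST_PER_RECORD_WITHOUT_BCM = 161.0  # USD
COST_PER_RECORD_WITH_BCM = 147.0     # USD


def breach_cost(records: int, with_bcm: bool) -> float:
    """Estimated total breach cost for a given number of compromised records."""
    per_record = COST_PER_RECORD_WITH_BCM if with_bcm else COST_PER_RECORD_WITHOUT_BCM
    return records * per_record


if __name__ == "__main__":
    records = 23_000  # hypothetical breach size, chosen only for illustration
    saving = breach_cost(records, with_bcm=False) - breach_cost(records, with_bcm=True)
    print(f"Estimated saving with BCM involvement: ${saving:,.0f}")
    # 23,000 records x $14 per record = $322,000
```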

Identifying and containing a data breach quickly is instrumental to limiting its impact and the study found that business continuity involvement can reduce the mean time to identify a data breach from 234 to 178 days, and the mean time to contain a data breach from 83 to 55 days.

Perhaps most important, the study found that BCM involvement with security operations can actually reduce the likelihood of data breach. According to the Ponemon study, the likelihood of a data breach involving 10,000 or more records striking a company that involves BCM in security operations is 21.1%, compared to 27.9% for organizations that have no BCM involvement with security. And if a breach does occur, it will negatively affect the business operations of only 55% of organizations that involve BCM with security, compared to 80% of organizations with no such involvement.

Clearly, BCM involvement with security operations can help limit the instances of data breach and mitigate the damage caused if a breach does occur. Organizations now understand this, and are finding ways to coordinate security and BCM response to breach. According to the Ponemon study, roughly 50% of the companies polled now have BCM involvement in data breach response planning and execution, up from 45% in 2014.

For further information on how business continuity management and security operations can work together to limit the impact of a data breach, read the IBM white paper on how business continuity management and security can work together to safeguard data.

The technology industry today is transforming its approach to assessing and managing third parties for bribery and corruption risk. As if keeping up with new and ever-changing legislation and regulations weren’t already a massive challenge for organizations, FCPA enforcement has reached a whole new level of intensity, with the DOJ putting heavy resources behind enforcement actions.

But fines are just the tip of the iceberg, and even greater expense may be incurred in pre- and post-enforcement activity. Investigations and their associated legal fees often far exceed the actual fines. In many cases, they can run to five or 10 times more. Post-enforcement costs – updating policies, increased training and dealing with monitors – can also be significant and may last years. In addition, FCPA violations can have a damaging and public effect on a company’s reputation and long-term revenues.

The reality is that many FCPA risks arise from relationships with third parties — agents, brokers, distributors, suppliers and others who may interact with foreign governments or their agents. The following points are red flags that require input from your third parties:



(MCT) - A slow-moving storm that has left parts of Charleston underwater has dumped a foot of rain on the Columbia area since midnight.

The historic rainfall submerged low-lying traffic intersections around Columbia including Devine Street and Rosewood Drive and areas around Decker Boulevard.

Richland County declared a state of emergency Sunday, which allows the county to seek help from state emergency officials and buy emergency equipment and supplies.



Global assets under management (AuM) are set to swell to US$102 trillion by 2020. According to a new report from PwC, the tax function – which is about to undergo significant change – will be critical in determining which players in the market are best positioned to win a greater share of business in the lead-up to that date.

According to the report, ‘Asset Management 2020 and beyond: Transforming your business for a new global tax world’, as banks and insurers retreat from many business lines, asset managers are becoming more influential across a range of products, creating a new breed of global mega-managers. This is attracting huge focus from tax authorities, who, come 2020, will have specialist teams with the capabilities to carry out much more detailed enquiries than in the past, and the powers to request real-time investor-related information.

Investors, therefore, will expect asset management providers to have robust and efficient tax infrastructures. They will have minimal tolerance for tax uncertainty or tax adjustments and will gravitate towards providers that offer products reflecting investor-specific tax profiles. Prospective investors will ask about tax disclosures and even take their individual tax charge into account before they consider investing in a fund. They will seek more certainty with respect to tax issues.

Portfolio taxation will become a key battleground

When launching new products, therefore, asset managers will routinely have to carry out full assessments to make them competitive in all channels. With more transaction taxes, local withholding and self-assessment capital gains regimes, every asset purchase and sale will have to be carefully examined from a tax risk and reporting perspective. This will require asset managers to have real-time access to data on global tax regimes.

PwC expects a number of integrated businesses to emerge, combining asset management, wealth management and private banking activities with the ability to provide a full tax advisory service to clients.

“In the lead up to 2020, investors’ evaluation on how their portfolios perform will focus predominantly on post-tax yields.  Asset managers therefore, will have little choice but to respond by dispersing their strategic tax resources throughout their business operations to give front, middle and back office staff access to real-time expertise,” says PwC’s William Taggart, Global  Tax Leader, Asset Management.

“In tandem in-house asset management tax teams will need to evolve to deal with perpetual audits and to engage with tax authorities on a frequent basis to influence policy and help guide the implementation of tax rules.”

Tax technology will be key to performance and client satisfaction

Technology for tax will enable investment firms to make timely, tax-informed investment decisions and provide investors and tax authorities with the transparency and reporting they demand. It will also make it possible to differentiate between the alpha (the return in excess of a benchmark index or "risk-free" investment) created by the portfolio manager and that created, indirectly, by the tax team's ability to manage tax leakage and tax risk.
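
As a rough numerical illustration of that distinction (a hypothetical sketch in Python; none of these return or tax figures come from the PwC report), "tax alpha" can be thought of as the extra post-tax return preserved by reducing tax leakage on the same portfolio:

```python
def post_tax_yield(gross_return: float, tax_drag: float) -> float:
    """Post-tax yield after a given level of tax leakage (both as decimal fractions)."""
    return gross_return * (1.0 - tax_drag)


if __name__ == "__main__":
    benchmark_gross = 0.060   # hypothetical benchmark return: 6.0%
    portfolio_gross = 0.075   # hypothetical managed portfolio return: 7.5%
    naive_tax_drag = 0.20     # tax leakage with no active tax management
    managed_tax_drag = 0.15   # tax leakage with an effective tax team

    investment_alpha = portfolio_gross - benchmark_gross                 # 1.5% pre-tax
    tax_alpha = (post_tax_yield(portfolio_gross, managed_tax_drag)
                 - post_tax_yield(portfolio_gross, naive_tax_drag))       # 0.375% post-tax
    print(f"Investment alpha (pre-tax): {investment_alpha:.3%}")
    print(f"Tax alpha (from reduced leakage): {tax_alpha:.3%}")
```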

Technology will not only be close to the heart of asset managers; tax authorities will also have made significant investments by 2020. As a result, the age of selective, paper-based reporting by asset managers to the tax authorities will be over. Tax authorities will request whatever information they want from asset managers through direct access to their IT systems, rather than relying on asset managers to push data to them.

Taggart concludes:

“Tax and reputation in the world of asset management, will be inseparable. The increased complexity of the tax function will require that it spends significant periods of time with operational activities in order to be able to act as a trusted advisor internally and to key executives. Asset managers will need to ensure highly-skilled tax people are brought into the heart of the business. The tone needs to be set at the top. The tax function is critical to the entire operation and senior management will need to make sure this is well understood.”

Notes to Editors

To help asset managers plan for the future, PwC’s report ‘Asset Management 2020 and beyond: Transforming your business for a new global tax world’ sets out a vision of what the tax landscape will look like in 2020 and beyond, and examines what it means for asset managers and their clients. The report recognizes that change will come incrementally, but argues that it should be started soon, with a long-term strategic vision in mind of how the tax function should operate, how it is resourced, and what its role is within the overall business. The report then sets out the characteristics of such a vision.

About PwC

PwC helps organisations and individuals create the value they’re looking for. We’re a network of firms in 157 countries with more than 195,000 people who are committed to delivering quality in assurance, tax and advisory services. Find out more and tell us what matters to you by visiting us at www.pwc.com.

© 2015 PwC. All rights reserved

PwC refers to the PwC network and/or one or more of its member firms, each of which is a separate legal entity. Please see www.pwc.com/structure for further details.

Scientists working off west Africa in the Cape Verde Islands have found evidence that the sudden collapse of a volcano there tens of thousands of years ago generated an ocean tsunami that dwarfed anything ever seen by humans. The researchers say an 800-foot wave engulfed an island more than 30 miles away. The study could revive a simmering controversy over whether sudden giant collapses present a realistic hazard today around volcanic islands, or even along more distant continental coasts. The study appears in the journal Science Advances.

"Our point is that flank collapses can happen extremely fast and catastrophically, and therefore are capable of triggering giant tsunamis," said lead author Ricardo Ramalho, who did the research as a postdoctoral associate at Columbia University's Lamont-Doherty Earth Observatory, where he is now an adjunct scientist. "They probably don't happen very often. But we need to take this into account when we think about the hazard potential of these kinds of volcanic features."

The apparent collapse occurred some 73,000 years ago at the Fogo volcano, one of the world's largest and most active island volcanoes. Nowadays, it towers 2,829 meters (9,300 feet) above sea level, and erupts about every 20 years. Santiago Island, where the wave apparently hit, is now home to some 250,000 people.



(MCT) - Ten years ago, Hurricane Katrina obliterated the political career of then-Federal Emergency Management Agency Director Michael Brown with the same savage brutality that it crushed the city of New Orleans.

“Truthfully, it was devastating,” said Brown, a Guymon native who resigned as director of the agency that coordinates federal disaster relief efforts in 2005 after being pilloried in the media for the government's response to Hurricane Katrina's destruction.

“People blame you for the deaths of people. ... It was the low point of my life,” said Brown, who has an undergraduate degree from Central State University (now the University of Central Oklahoma) and a law degree from Oklahoma City University.



FEMA and the state of Texas are highlighting Texas communities that have taken steps to reduce or eliminate long-term risk to people and property


DALLAS – After years of serious flooding, Dallas officials made a decision to reduce flood risk by redesigning an important ecosystem located in the heart of the city. The outcome not only solved a major problem, it resulted in a beautiful outdoor recreation area.

Historically, Dallas relied on dams and levees with grass-carpeted floodways to lower flood risk. But a problem unique in origin had become an obstacle demanding a non-structural solution.

A significant contributor to the flooding problem was that the Great Trinity Forest was coming back to life. This 6,000-acre forest stretches from the edge of downtown Dallas along the Trinity River to Interstate 20.

Much of it had been lumbered, ranched and farmed. However, over the last century, farmers and ranchers gradually abandoned the croplands and pastures. As a result, the trees and brush grew back into an increasingly dense forest, impeding Trinity River drainage through the city.

Six thousand square miles of watershed exist above downtown Dallas. That area drains through the half-mile-wide Dallas floodway in a levee-lined channel near downtown skyscrapers. When the river exits the levee system it immediately enters the Great Trinity Forest, which acts as an impediment.

Floodwaters would slow and back up against the downtown levee system, occasionally claiming lives and damaging or destroying homes and businesses. In the early 1990s, the U.S. Army Corps of Engineers (USACE), the city of Dallas and the U.S. Fish and Wildlife Service collaborated on a plan to solve the problem. The agencies arrived at an environmentally friendly, comprehensive flood risk management solution that avoided traditional concrete-lined channels or a sterile grass-carpeted floodplain.

Called “the chain of wetlands,” the proposal was to build a pathway through the Great Trinity Forest to efficiently carry floodwaters through the upper reaches of the forest and an old landfill and golf course to alleviate the backup. The project to construct the manmade chain of interconnected wetland ponds called for the removal of 271 acres of woody plants, including many trees, which would give clear passage for floodwaters.

Then, the bottomland forest would be replaced with richer and far more diverse wildlife habitat. Under National Environmental Policy Act requirements, removal of the woody plants required that planting take place elsewhere. Consequently, the tree removal was offset by planting higher-value habitat in the southern portion of the Great Trinity Forest, farther downstream.

Trees, bushes and vines were specifically selected to provide food and cover for wildlife. Directed by the USACE Lewisville Aquatic Ecosystem Research Facility, students from the University of North Texas, Texas A&M University, Collin Community College and North Central Texas College helped plant native Texas plants in the new wetlands ecosystem and in the mitigation area downstream.

Although this comprehensive project is a work in progress, the initiative has already shown impressive results. The project helped transport floodwaters from the record May 2015 rains, which were followed weeks later by the remnants of Tropical Storm Bill. The waters flowed effectively through the Dallas system as designed, reducing the risk to life, safety and property in the Trinity River watershed.

The Corps of Engineers estimates its integrated flood-risk reduction system, which includes reservoirs in the Upper Trinity River basin, prevented $6.7 billion in damages from the spring storms.

“Without the trees, the water now flows more efficiently through the upper reach of the Great Trinity Forest,” said Jim Frisinger, public affairs specialist, Fort Worth District USACE. “This new wetlands complex, which included planting trees downstream, proves ecosystem restoration paired with flood risk reduction can help solve challenging urban flooding issues. There is no doubt that Dallas would have been in far more trouble without this solution.”

The Upper Chain of Wetlands Fact Sheet  has additional information about this project.

To learn more about how cities and towns across Texas are building stronger, safer communities visit Best Practice Stories | FEMA.gov.


FEMA’s mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards. 

Seeking public comments on proposed 15-year ecosystem plan


The BP Macondo Deepwater Horizon oil rig exploded on April 20, 2010. Approximately 3.19 million barrels (134 million gallons) of oil were released into the Gulf of Mexico, making it by far the largest offshore oil spill in United States history. (Credit: US Coast Guard)


NOAA and the other Deepwater Horizon Natural Resource Trustees today released a 15-year, comprehensive, integrated ecosystem restoration plan for the Gulf of Mexico in response to the April 20, 2010 Deepwater Horizon oil rig explosion and spill. Implementing the plan will cost up to $8.8 billion. The explosion killed 11 rig workers, and the subsequent spill lasted 87 days and affected both human and natural resources across the Gulf.

The Deepwater Horizon Oil Spill Draft Programmatic Damage Assessment and Restoration Plan and Draft Programmatic Environmental Impact Statement allocates Natural Resource Damage Assessment monies that are part of a comprehensive settlement agreement in principle among BP, the U.S. Department of Justice on behalf of federal agencies, and the five affected Gulf states, announced on July 2, 2015. The Department of Justice today lodged a consent decree in U.S. District Court as part of the more than $20 billion settlement.

In the draft plan, the Trustees provide documentation detailing impacts from the Deepwater Horizon oil spill to:

  • wildlife, including fish, oysters, plankton, birds, sea turtles, and marine mammals across the Gulf
  • habitat, including marshes, beaches, floating seaweed habitats, water column, submerged aquatic vegetation, and ocean-bottom habitats
  • recreational activities including boating, fishing, and going to the beach

The Trustees determined that “overall, the ecological scope of impacts from the Deepwater Horizon spill was unprecedented, with injuries affecting a wide array of linked resources across the northern Gulf ecosystem.” As a result of the wide scope of impacts identified, the Trustees “have determined that the best method for addressing the injuries is a comprehensive, integrated, ecosystem restoration plan.”

Both the consent decree and the draft plan are available for 60 days of public comment. The Trustees will address public comment in adopting a final plan. For the consent decree, once public comment is taken into account the court will be asked to make it final.

Bottlenose dolphins, which had to swim through heavily oiled waters, suffered serious reproductive and adverse health effects from the oil, some of which are still being determined. (Credit: NOAA)


Public comments on the draft plan will be accepted at eight public meetings to be held between October 19 and November 18 in each of the impacted states and in Washington, DC. Comments will also be accepted online and by mail sent to: U.S. Fish and Wildlife Service, P.O. Box 49567, Atlanta, GA 30345. The public comment period will end on December 4, 2015.

The Trustees are proposing to accept this settlement, which includes, among other components, an amount to address natural resource damages of $8.1 billion for restoration and up to $700 million for addressing unknown impacts or for adaptive management. These amounts include the $1 billion in early restoration funds which BP has already committed. 

“NOAA scientists were on the scene from day one as the Deepwater spill and its impacts unfolded. NOAA and the Trustees have gathered thousands of samples and conducted millions of analyses to understand the impacts of this spill,” said Kathryn D. Sullivan, Ph.D., undersecretary of commerce for oceans and atmosphere and NOAA administrator. “The scientific assessment concluded that there was grave injury to a wide range of natural resources and loss of the benefits they provide. Restoring the environment and compensating for the lost use of those resources is best achieved by a broad-based ecosystem approach to restore this vitally important part of our nation’s environmental, cultural and economic heritage.”

NOAA led the development of the 1,400-page draft damage assessment and restoration plan, with accompanying environmental impact statement, in coordination with all of the natural resource Trustees. The draft plan is designed to provide a programmatic analysis of the type and magnitude of the natural resource injuries identified through a Natural Resource Damage Assessment conducted as required by the Oil Pollution Act of 1990, together with a programmatic restoration plan to address those injuries. Alternative approaches to restoration are evaluated in the plan under the Oil Pollution Act and the National Environmental Policy Act.

Specific projects are not identified in this plan, but will be proposed in future project-specific restoration proposals. The Trustees will ensure that the public is involved in their development through public notice of proposed restoration plans, opportunities for public meetings, and consideration of all comments received.

The draft plan sets out an array of restoration types that address a broad range of impacts at both regional and local scales. It allocates funds across five restoration goals and 13 restoration types designed to meet those goals.

The five overarching goals of the proposed plan are to:

  • restore and conserve habitat
  • restore water quality
  • replenish and protect living coastal and marine resources
  • provide and enhance human use recreational activities
  • provide for long-term monitoring, adaptive management, and administrative oversight of restoration

The 13 proposed restoration activities are:

  1. Restoration of wetlands, coastal, and nearshore habitats
  2. Habitat projects on federally managed lands
  3. Nutrient reduction
  4. Water quality
  5. Fish and water column invertebrates
  6. Sturgeon
  7. Submerged aquatic vegetation
  8. Oysters
  9. Sea turtles
  10. Marine mammals
  11. Birds
  12. Low-light and deep seafloor communities
  13. Provide and enhance recreational opportunities

Together, these efforts will restore wildlife and habitat in the Gulf by addressing the ecosystem injuries that resulted from the Deepwater Horizon incident.

Once the plan is finally approved and the settlement is finalized, NOAA will continue to work with all of the Trustees to plan, approve, and implement restoration projects.  NOAA will bring scientific  expertise and focus on addressing remedies for living marine resources — including fish, sturgeon, marine mammals, and sea turtles — as well as coastal habitats and water quality. NOAA scientists developed numerous scientific papers for the NRDA case including documentation of impacts to bottlenose dolphins, pelagic fish, sea turtles, benthic habitat and deep water corals.

The Deepwater Horizon Oil Spill Draft Programmatic Damage Assessment and Restoration Plan and Draft Programmatic Environmental Impact Statement is available for public review and comment through December 4. It is posted at www.gulfspillrestoration.noaa.gov and will be available at public repositories throughout the Gulf and at the meetings listed below.


Meeting schedule (dates, local times and locations):


Mon., Oct. 19, 2015

5:00 PM Open House
6:00 PM Public Meeting

Courtyard by Marriott – Houma
142 Library Boulevard
Houma, LA 70360

Tues., Oct. 20, 2015

5:00 PM Open House
6:00 PM Public Meeting

University of Southern Mississippi,
Long Beach
FEC Auditorium
730 East Beach Boulevard
Long Beach, MS 39560

Thurs., Oct. 22, 2015

5:00 PM Open House
6:00 PM Public Meeting

Hilton Garden Inn New Orleans
Convention Center, Garden Ballroom
1001 South Peters St
New Orleans, LA 70130

Mon., Oct. 26, 2015

6:00 PM Open House
7:00 PM Public Meeting

The Battle House Renaissance
Mobile Hotel
26 N Royal St
Mobile, AL 36602

Tues., Oct. 27, 2015

6:00 PM Open House
7:00 PM Public Meeting

Pensacola Bay Center
201 E Gregory St
Pensacola, FL 32502

Thurs., Oct. 29, 2015

6:00 PM Open House
7:00 PM Public Meeting

Hilton St. Petersburg Bayfront
333 1st Street South
St. Petersburg, FL 33701

Tues., Nov. 10, 2015

6:00 PM Open House
7:00 PM Public Meeting

Hilton Galveston Island Resort
Crystal Ballroom
5400 Seawall Boulevard
Galveston, TX 77551

Wed., Nov. 18, 2015

6:00 PM Open House
7:00 PM Public Meeting

DoubleTree by Hilton
1515 Rhode Island Ave NW
Washington, DC 20005

All public meetings will begin with an interactive open house where the public can learn details of the assessment and proposed restoration activities. The open house will be followed by a formal presentation and opportunity for the public to provide comments on the draft plan, as well as on the proposed settlement with BP.

NOAA’s mission is to understand and predict changes in the Earth's environment, from the depths of the ocean to the surface of the sun, and to conserve and manage our coastal and marine resources. Join us on Facebook, Twitter, Instagram and our other social media channels.


SACRAMENTO, Calif. – If you live in Calaveras or Lake counties and were affected by the recent wildfires and are insured, you may still be eligible for FEMA assistance.

By law, FEMA cannot duplicate insurance or other benefits. However, FEMA may be able to help survivors with uninsured or underinsured losses or if their insurance settlement is delayed. Applicants should notify FEMA of their situation and provide insurance company documentation.

If a survivor received a settlement from their insurance company and still has unmet disaster-related needs, they may be eligible for a grant.

If a survivor has exhausted the settlement from their insurance for Additional Living Expenses (ALE, for loss of use), FEMA may be able to assist with disaster-related temporary housing.

If an insurance settlement is insufficient to cover disaster-related needs, survivors may be eligible for grants to cover emergency home repairs, disaster-related medical, dental and funeral costs and other disaster-related expenses.

If a survivor’s insurance settlement has been delayed longer than 30 days from the time they filed the claim, they should contact FEMA. After providing the necessary documentation – the claim number, date applied, and an estimate of how long it will take to receive a settlement – a survivor may qualify for an advance that would have to be repaid to FEMA once the insurance settlement is received.

Survivors can register for FEMA assistance online at DisasterAssistance.gov or by calling 800-621-3362 (TTY 800-462-7585). Those who use 711 or Video Relay Service (VRS) can call 800-621-3362.



A new survey has shown a profound lack of confidence among the UK public surrounding the ability of public and private sector organisations to protect their personal data from hackers.

According to the Bit9 + Carbon Black research, which comprised a poll of over 2,000 consumers, more than four in five Britons (81 percent) fear that unreported data breaches may already have put their information in the hands of hackers.

What’s more, almost three-quarters (73 percent) believe the time it takes for organisations to detect and report a breach is “unacceptable”.



Wednesday, 23 September 2015 00:00

IT Colocation Services and the Neighbours

Can’t afford your own data centre? Want to grow a small business and looking for somewhere else to put your IT servers? Colocation services might be the solution. The idea is that for a monthly fee, providers will give your company space in a purpose-built facility with cooling, redundant power supplies and resilient, high-speed network connectivity. Naturally, service levels and quality may vary, but colocation should be a cost-effective way of relocating your servers for security and square footage. What’s not to like? The neighbours, perhaps…



Boards, regulators and leadership teams are demanding more and more of risk, compliance, audit, IT and security teams. They are asking them to collaboratively focus on identifying, analyzing and managing the portfolio of risks that really matter to the business.

As risk management programs evolve into more formal processes aligned with business objectives, leaders are realizing that by developing a proactive mindset in risk and compliance management, teams can provide added value, helping the organization gain agility by identifying new opportunities as well as managing downside risk. Organizations with this new perspective are more successful in orchestrating change to provide a 360-degree view of both risk and opportunity.



This article provides an overview of GPG Professional Practice 3 (PP3) – Analysis, which is the professional practice that “reviews and assesses an organization in terms of what its objectives are, how it functions, and the constraints of the environment in which it operates”.


PP3 introduces and addresses the business impact analysis (BIA) as a primary means of analysis, leading to appropriate business continuity requirements.  PP3 identifies the following beneficial outcomes from the BIA:



Almost a quarter of businesses reported annual cumulative losses of at least $1.05 million (CAD $1.4 million) due to supply chain disruptions, and 76% of businesses reported at least one instance of supply chain disruption annually, according to a survey conducted by the Business Continuity Institute and Zurich. The top causes of supply chain failure among businesses surveyed are ones likely to become even more frequent in the coming years: unplanned IT outages, cyberattacks, and adverse weather.

As the supply chain continues to grow ever longer, adding more potentially disruptive risks along the way, businesses are learning some painful lessons about the financial and reputational damages that can result from failures to ensure supply chain resilience.

Check out the infographic below for some of Zurich’s top insights on supply chain visibility, including the biggest sources of damage and key steps to mitigate losses:



(MCT) - A man walked into a biology lab at Lamar University and sprinkled food into a fish tank, sustaining Trinidad-plucked guppies while the professor monitoring them was unable to tend to the subjects of his life's research.

It was a minor happening, but a win nonetheless while worry and stress gripped the Beaumont university reeling from $50 million in damage by Hurricane Rita, a storm that a top official said highlighted deficiencies in emergency preparedness and threatened to derail students' lives.

Ten years later, people across the entirety of Lamar's spectrum -- alumni, professors, officials and maintenance workers -- remembered how everyone came together to solve the most-pressing issue: resuming classes as quickly as possible to avoid canceling graduation.

They also point to structural changes they said alleviated some of the problems three years later during Hurricane Ike and should help Lamar University the next time a major storm strikes southeast Texas.



Wednesday, 23 September 2015 00:00

A Taste of Designing Mobile Experiences

Designers and engineers at Citrix use human-centered innovation approaches such as Design Thinking to create compelling user experiences for mobile devices. A few recent experiments from our internal incubators show how designing with the user at the center of the stakeholder map can improve the overall UX, introduce new concepts to the market, or find applications for existing Citrix products in new verticals and new use case scenarios.

For example, the Cubefree team created a Yelp-like app for mobile workers starting with low-fi prototypes, then iterating on both the product and the business model during a 3-month Citrix Startup Accelerator program. The PatientConsult team used a similar approach, starting with gaining empathy for doctors and specialists, identifying their specific needs, and prototyping an app for secure communication in the healthcare vertical. Not to mention the newly released Citrix Workspace Cloud that focuses on Citrix customer needs and seamlessly integrates multiple offerings to satisfy them!



Wednesday, 23 September 2015 00:00

Storage in a Diversifying Data Environment

Larger data loads are coming to the enterprise, both as a function of Big Data and the steady uptick of normal business activity. This will naturally wreak havoc with much of today’s traditional storage infrastructure, which is being tasked not only with providing more capacity but also with speeding up and simplifying the storage and retrieval process.

Most organizations already realize that with the changing nature of data, simply expanding legacy infrastructure is not the answer. Rather, we should be thinking about rebuilding storage from a fundamental level in order to derive real value from the multi-sourced, real-time data that is emerging in the new digital economy.



Network World took a look at a study by tyntec that suggested that “a vast majority” of companies don’t protect themselves adequately from BYOD issues. About half (49 percent) of these firms have employees that at least partially use their own devices at work, which poses huge security risks. To that end, Molson Coors’ CIO Christine Vanderpool offers three lists that look at the risks of BYOD, risk issues to keep in mind and data access and security considerations.

Two surveys by Bitglass, highlighted by eWeek, found that employees and even IT personnel are not happy with mobile device management (MDM) platforms, which they fear can access, alter or delete personal data.

People who work for an organization don’t want to be in a situation in which their personal data is under the control of their employer. The most telling statistics from the surveys show that IT personnel – the very people who will be called upon to make such programs work – are almost as skeptical as the folks from PR and accounting about MDM platforms and BYOD:



Wednesday, 23 September 2015 00:00

VVOLs and VMware

The definition of VVOLs is simple but the effect is ground-breaking. Here is the simple definition part: Virtual Volumes (VVOL) is an out-of-band communication protocol between array-based storage services and vSphere 6.

And here is the ground-breaking part: VVOLs enables a VM to communicate its data management requirements directly to the storage array. The idea is to automate and optimize storage resources at the VM level instead of placing data services at the LUN (block storage) or the file share (NAS) level.

VMware replaces these aggregated datastores with one Virtual Volume (VVOL) endpoint whose data services match individual VM requirements. VVOLs enable more granular control over VMs and increase their visibility on the storage array. Note however that the array still operates within its own limitations. If an administrator has applied a policy to the VM with a specific snapshot schedule and the array cannot comply, then the VM doesn’t get that schedule.
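
The policy-matching behaviour described above can be illustrated with a small conceptual sketch (plain Python; this is not VMware's SPBM/VASA API, and all names are hypothetical): each VM states its own storage requirements, and the array grants only what it can actually deliver.

```python
from dataclasses import dataclass, field


@dataclass
class ArrayCapabilities:
    """Hypothetical capabilities advertised by a VVOL-capable array."""
    snapshot_schedules: set = field(default_factory=lambda: {"hourly", "daily"})
    replication: bool = True


@dataclass
class VmStoragePolicy:
    """Per-VM requirements, instead of one policy per LUN or file share."""
    name: str
    snapshot_schedule: str
    replication: bool = False


def apply_policy(vm_name: str, policy: VmStoragePolicy, array: ArrayCapabilities) -> dict:
    """Return the service levels the array can actually honour for this VM.

    Mirrors the behaviour described above: if the array cannot comply with a
    requested snapshot schedule, the VM simply does not get that schedule.
    """
    return {
        "vm": vm_name,
        "snapshot_schedule": policy.snapshot_schedule
        if policy.snapshot_schedule in array.snapshot_schedules
        else None,  # array cannot comply, so the request is dropped
        "replication": policy.replication and array.replication,
    }


if __name__ == "__main__":
    array = ArrayCapabilities()
    gold = VmStoragePolicy("gold", snapshot_schedule="every-5-min", replication=True)
    print(apply_policy("finance-db-01", gold, array))
    # {'vm': 'finance-db-01', 'snapshot_schedule': None, 'replication': True}
```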



Wednesday, 23 September 2015 00:00

Layering Governance Over Cloud

As the latest Amazon earnings announcement for AWS suggests, enterprises have adopted cloud at a rapid pace over the last few years as part of the emerging Bimodal IT paradigm. However, given the focus on cost and agile development, the sourcing of cloud vendors has typically been cost-based, and the governance frameworks adopted have been largely empirical. The recent Sony cyberattacks have proved beyond doubt that enterprise data is the biggest source of competitive advantage in today’s digital era and needs to be preserved and protected at all costs. Today, as critical business processes and data have started moving to the cloud, there is an increasing clamour for newer and more specific risk and control measures to ensure information security. At the same time, the threat landscape and information security requirements change with each vendor, location, service, business priority and more. But this does not and should not mean that organizations need to re-invent their cloud management systems and governance processes every time the threat landscape evolves.

As the phenomenon of cloud-based software deployment becomes the new normal, enterprises need to take a deeper and renewed look at Information Security and Risk Management instead of perpetually trying to re-build their Governance, Risk and Compliance (GRC) programs to keep pace with regulations and emerging cloud service models and technologies. The leading organizations of tomorrow need to adopt a layering approach: create a single GRC layer over the cloud ecosystem, one that can expand across multiple cloud vendors and models. This layering approach is imperative to ensure the cloud ecosystem can scale securely across the following attributes:



Things are seriously bad when one of the world’s most respected business-focused publications, the Financial Times (FT), asks whether the auto “industry faces ‘Libor moment’”. Yet that was the headline yesterday on the lead article in the FT about the still-expanding crisis involving the auto manufacturer Volkswagen (VW) and the emissions-test cheating that has come to light over the past few days. Last week, the US accused VW of rigging its 500,000 American diesel cars so they would pass emissions requirements when being tested, yet belch out 30%-40% more pollution in actual operation. VW accomplished this through software that could distinguish between testing and normal operation.

What do you think the chances are that VW was not aware that the ‘defeat device’ software was in its vehicles? Does anyone out there think that VW negligently installed and upgraded software across multiple product lines for over six years in upwards of 11 million autos? If you do, it may be time for a very long session on the meaning of the word intentional.

However, the world was stunned this week when VW not only admitted that it had installed software to provide incorrect data on emissions tests for its diesel vehicles in the US but also, as reported in the online publication Slate, “the German car manufacturer announced that 11 million of its cars were fitted with diesel engines that had been designed to cheat emissions standards.” Obviously the culture of the company comes into serious question when such a worldwide, multiyear, systemic plan is designed and implemented to break the law.



Geo-clusters are something I often get asked about, especially by clients who are looking to protect mission-critical applications and mitigate the chances of data going missing. In this post, we’ll analyse what they are and what they can be used for.

What is a geo-cluster and how can it help prevent data loss?

In order to address what a geo-cluster is, it is first important to understand the concept of a Database Availability Group (DAG). A DAG allows an organisation to maintain up to 16 copies of an Exchange database (EDB). This comes into play in situations (e.g. a server failure or an offline server) where users are prevented from accessing the primary Exchange server. A more detailed explanation of potential scenarios and how to implement DAGs can be found here. Another important term to understand is High Availability (HA), which Microsoft defines as “the implementation of a system design that ensures a high level of operational continuity over a given period of time.”
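
As a rough illustration of the DAG concept (a conceptual Python sketch, not Exchange's actual Active Manager logic; the 16-copy limit comes from the paragraph above): when the active database copy becomes unhealthy, a healthy passive copy is activated, ideally in another site in the case of a geo-cluster.

```python
from dataclasses import dataclass

MAX_DAG_COPIES = 16  # an Exchange DAG supports up to 16 copies of a database


@dataclass
class DatabaseCopy:
    server: str
    site: str
    healthy: bool = True
    active: bool = False


def failover(copies: list[DatabaseCopy]) -> DatabaseCopy:
    """Activate the first healthy passive copy when the active copy is down.

    Conceptual only: real DAGs use Active Manager, quorum and witness servers
    to decide which copy to mount.
    """
    if len(copies) > MAX_DAG_COPIES:
        raise ValueError("a DAG cannot hold more than 16 copies of a database")
    active = next((c for c in copies if c.active), None)
    if active and active.healthy:
        return active  # nothing to do
    if active:
        active.active = False  # demote the failed copy
    for copy in copies:
        if copy.healthy and not copy.active:
            copy.active = True
            return copy
    raise RuntimeError("no healthy copy available to activate")


if __name__ == "__main__":
    dag = [
        DatabaseCopy("EX01", site="London", active=True, healthy=False),
        DatabaseCopy("EX02", site="London"),
        DatabaseCopy("EX03", site="Manchester"),
    ]
    print(failover(dag).server)  # EX02 takes over within the primary site
```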



Hackers have leveraged malicious code to attack apps commonly used on Apple (AAPL) iPhones and iPads in China.

And as a result, Apple tops this week's list of IT security newsmakers to watch, followed by IBM (IBM), Vodafone (VOD) and the VisitorTracker malware. 

What can managed service providers (MSPs) and their customers learn from this week's IT security newsmakers? Check out this week's edition of IT security stories to watch to find out:



Preventing data breaches in an organization requires a strong collaborative effort between the HR and IT departments—a collaboration that may even involve a blurring of the line between those traditionally separate functions.

That’s the assessment of Jacqui Summons, international HR director at Clearswift, a provider of data loss prevention technology in the UK. I had the opportunity to speak with Summons about this topic recently, and I began the conversation by asking her to provide an overview of what HR’s role should be in preventing data loss. She said the role is one that HR directors are slowly adopting:



(MCT) - We all know that it's only a matter of when, not if, the Big One hits.

Yet so few of us are prepared for a sizable earthquake or another disaster.

"We've got it pretty good here," said Chris Ipsen, a spokesman for the Los Angeles City Emergency Management Department. "We live in an area that has a lot of resources, the weather is excellent … people just get real comfortable."

In many other parts of the country and around the globe, light switches don't always work and tornadoes, torrential rains and crippling snowstorms are a seasonal occurrence. But in Southern California, Ipsen said, "the mentality can be, 'It's not going to happen, and if it is going to happen, it's not going to happen to me.'"



New research from two security companies shows that DDoS attacks are a lot more serious than previously thought. The nuisance attacks are doing more than shutting down websites, shutting out customers, and giving IT staff the unwanted task of fixing the problems. They are now being used for malware downloads and resulting in data loss.

Kaspersky Lab reported that companies have a one-in-five chance of being the victim of a DDoS attack. Worse, nearly one out of every three DDoS attacks coincided with a network intrusion, leading 31 percent of small businesses and 22 percent of larger businesses to suffer data loss. In fact, of the 5,500 respondents to its survey, 32 percent said that the DDoS attack happened in conjunction with a network intrusion.

In a release, Evgeny Vigovsky, head of Kaspersky DDoS Protection, stated:



(MCT) – First a 1,200-pound bomb inside a rental van blew a crater into the base of the World Trade Center in 1993, killing six.

Then a rental truck packed with fertilizer exploded in front of the federal building in Oklahoma City in 1995, killing 168 people.

The detonation of a backpack nail bomb a year later inside a public plaza -- killing one person in Centennial Olympic Park in downtown Atlanta -- was the final straw. Those attacks inspired a new approach for protecting Americans and visiting dignitaries at large events from the growing threat of terrorism and violence on U.S. soil.



The more complex a system becomes, the greater the chance it will experience failure. And as more people put their data in the cloud, more security issues are cropping up.

To be sure, the cloud has not experienced the kind of catastrophic security failures that traditional IT sees far too often. But can isolated incidents – such as the hack of celebrity iCloud accounts, the password theft at Dropbox, and the PlayStation Network attack that compromised the data of over 100 million customers – point to a trend that might flare up in the coming years?



Everybody likes self-service these days. We have self-service gas, self-service car washes--heck you can even self-service your mortgage application with just a few mouse clicks.

So it’s no surprise that knowledge workers are bringing this ethos to the office and bumping up against the idea of someone else telling them what resources and applications they can use to do their jobs, and how to get them. In organizations that push back against self-service, many employees simply seek their data infrastructure elsewhere, driving up levels of shadow IT.



By Ben J. Carnevale

Nearly two years in the making, a 98-page document – ISO 18788:2015, Management System for Private Security Operations – was published on September 15th, 2015.

Among its many benefits, this standard provides the principles and requirements for a security operations management system (SOMS), as well as a framework for establishing, implementing, operating, monitoring, reviewing, maintaining and improving the management of security operations for organizations conducting or contracting security operations and related activities and functions.

Just as important, the standard helps an organization demonstrate: (a) the conduct of professional security operations to meet the requirements of its clients and other stakeholders, (b) accountability to law and respect for human rights, and (c) consistency with the voluntary commitments to which it subscribes.

Tuesday, 22 September 2015 00:00

3 Steps to Failover

When it comes to disaster recovery and keeping your business running, there are three key steps to take, no matter the scale.

Whether it’s a large-scale disaster, a crashed server or even just a file that gets deleted, it’s important to properly assess the situation, act on it with a plan and get things back to normal. In our case at Net Sciences, we were hit with three hosts, seven servers, an entire cluster--all down.



Like many of you, I have a number of routine checks that I run on my Exchange servers to keep them in good health. One of those areas is managing user mailbox quotas. I’ll often spend a couple of hours a week with users, helping them implement a mailbox storage diet and explaining the importance of keeping their email managed properly. More interestingly, I have been asked numerous times how this relates to data loss and what actions can be taken to prevent it. In this post, the second in our Exchange series, we’ll take a look at this in more detail.



Tuesday, 22 September 2015 00:00

The Myth of Resources Required Over Time

In many organizations, buried somewhere in their Business Impact Analysis (BIA), is a form asking participants to designate what Resources (computers, phones, printers – even desks and chairs) they will require if their normal business operations are disrupted.

That sounds like a reasonable request.  For years the concept of Resources-over-Time has slithered its way into the ‘standards’ many organizations (and many BCM software products) follow as part of the BIA process.  But without knowing what the disruption will be,  when it will happen, how severe it will be or how long it may last, is it possible to predict what Resources will be needed?

Suppose you were going to go on a hike in the wilderness.  How much food and water would you bring?  You’d need to answer some questions first:  How long is the hike?  What will the temperature be?  Without those facts, you can only guess what you’ll need to pack.  You risk either running short – or over-packing and needlessly increasing the weight of your backpack.



Tuesday, 22 September 2015 00:00

The Business of Visual Storytelling

When was the last time you heard a really great story from one of your customers?

Chances are you hear them all the time – but why keep them to yourself? Spreading those stories across your organization can be a valuable knowledge-sharing tactic.

Storytelling has been a natural hobby of mine forever, but it’s also my favorite way of learning for business. Give me a list of specs, features or names and chances are that I’m not going to remember much about them. But tell me a story about the benefits of how those features can be applied and I’ve got perspective that will make the idea stick. Draw me a picture and I’ll get it even faster – and likely be able to tell the story myself.

Last October, I was approached by Sue Morgan, Sr. Program Manager within Customer Experience (CX). She had taken notice of the great responses we received when we used pictures to explain new features or products. It’s a great way of communicating, but she wondered how we might be able to use drawing for a more outside-in approach. Together, we gathered a handful of artists for a long-term experiment.



What does a prepared community look like?

As communities look at how to prepare for the next emergency, they usually focus on stockpiling emergency supplies, having clear alert networks and ways to communicate with the public, and designating evacuation routes and shelter locations. While all of these are key aspects of emergency planning, one area of preparedness that is often overlooked is community health. Community Health is a term used to describe the state of health and how easy or difficult it is to be healthy where people live, learn, work and play. The health of a community, including ease of access to medical care and community resources available for exercise and encouraging healthy habits, is an important part of emergency planning that can have a positive impact on a community before, during, and after a public health emergency.

What is a Healthy Community?


A healthy community is one in which local groups from all parts of the community work together to prevent disease and make healthy living options accessible. Working at the community level to promote healthy living brings the greatest health benefits to the greatest number of people. It also helps to reduce health gaps caused by differences in income, education, race and ethnicity, location and other factors that can affect health. Healthy communities commonly have high vaccination rates to protect citizens from diseases and easy access to medical care and healthy food; are designed for healthy living at home, work, and school; and provide good mental health resources. Often, this also means it is safe and easy to walk, bike, and play in parks and community spaces.

How is a Healthy Community Better Prepared?

Communities that have good health resources in place and healthy community members can often recover from a disaster more quickly and with fewer negative health issues. Individuals who are in good physical shape, have proper vaccinations, have access to clinical services and medications, and know where to get critical health and emergency alert information can better recover from a disaster and are more likely to be able to contribute to a community’s recovery efforts. After a natural disaster, people may be displaced or may be gathered or taking shelter in crowded group settings. When a large number of people are gathering or living in these crowded areas, it is imperative that people are up to date on their vaccinations in order to reduce the spread of disease.

Unhealthy communities often have a large number of individuals who are more vulnerable before, during, and after a disaster. Factors that lead to poor health in communities, such as high rates of chronic diseases like diabetes and heart disease, limited access to general medical care, and low levels of health education, can cause substantial difficulties for a community recovering from an emergency event. Gaps in medical care can increase significantly after a disaster due to physical damage to health care facilities or a large increase in the number of people who need medical attention. People who already have poor health are usually more susceptible to disease during a public health emergency and cannot get the normal day-to-day medical care they need.

Make Your Community Healthy and Prepared


You can help improve the health of your community by taking a look at your health and the health of your family. Take actions to ensure that you are as healthy as possible. Before an emergency, if you eat well, get regular checkups and vaccinations, and are physically active, your body will be better able to handle the stress and physical demands of recovering from a disaster. Washing your hands regularly can also help reduce your chances of getting sick during and after an emergency.

Help promote health in your community by becoming more engaged in your community. Encourage local community groups and government organizations to consider community health in their emergency preparedness plans. Take action to improve your community’s health now to ensure you are better prepared to remain healthy when an emergency occurs.

Monday, 21 September 2015 00:00

Linus enters the BCI Hall of Fame


Winning a BCI Award shows a high standard of excellence; it shows that you stand tall among your rivals and act as a beacon for others to aspire to. Winning a BCI Award on a regular basis, however, takes something extra special.

The BCI is pleased to announce that the latest entry to the Hall of Fame is Linus – winner of Continuity and Resilience Provider (Service/product) award at the Australasian Awards in 2013, 2014 and now 2015.

“Being invited to join the BCI Hall of Fame is something that we value highly, as it recognises that winning awards over a number of years is a remarkable achievement,” said Saul Midler, CEO of Linus. “The fact that Linus Revive has been recognised as the best BC product in Australasia for several years demonstrates the consistently high level of applicability our software and services have in the community.”

The Business Continuity Institute’s Hall of Fame, set up in 2015, is for those who have not only displayed a high standard of achievement, but have done so consistently. As such, only those who have won three BCI Awards within the same category will be permitted to enter.

Monday, 21 September 2015 00:00

MSPs: How to Handle the Top 5 Cloud Myths

In retrospect, the guys who came up with cloud computing could have done a better job of naming it. Sure, the technology got its nebulous name primarily due to its lack of any well-defined boundaries, but to anyone not familiar with tech-speak, the word “cloud” usually inspires weather-related images. Even people who are familiar with some aspects of cloud-based file sharing services are walking around with notions about it that are just plain wrong.

That said, the damage is done. As a result, many an MSP with a cloud-based model may have to wade through a plethora of myths before they can get their prospects to start taking them seriously.

Some of the top five myths doing the rounds out there might shock you. Yet dealing with these myths is extremely important and could impact your sales efforts, so be sure to anticipate questions related to them. Or, better yet, combat them before your prospect even asks.



Monday, 21 September 2015 00:00

Selling Proactive Response

Perhaps the most useful thing about managed services isn’t just the easy pricing model or the simplicity the services bring to small businesses; it’s in being proactive about potential issues.

I recently spoke with a StorageCraft partner who had a story about two types of people. One type knows the value in technology and is willing to invest in it, and one doesn’t. This partner had a promo offer of two free service hours to local businesses. The idea is typically to come in and assess networks and offer a few suggestions, and hopefully win a larger managed service contract. One business that reached out said they wanted to pocket the two hours and use them when there was a problem. Our partner told the business owner to keep the hours for later, but mentioned that they’d still come and do a free evaluation to identify potential problems. The business still refused.



The enterprise is formulating big plans for Big Data, but first there is the little matter of deploying big infrastructure to handle the load.

To be sure, not all of the data generated by legions of smartphone apps and RF-connected sensors will need to be compiled in a central repository. Much of it will be too fleeting to be of any use after a few minutes: think optimized search results for recent logins or sales specials based on the buying histories of in-store customers. These are best handled by automated on-site or near-site systems.

Still, large amounts of data will head back to the data center where it can be used to chart historical trends, update user records and generally optimize and refine business processes. For these volumes, the most readily available solution is the data lake, which is part repository, part warehouse and part analytics engine—but wholly expensive and complex.



Monday, 21 September 2015 00:00

Is Wall Street ready for the next cyberattack?

Take a deep breath, and imagine a doomsday scenario on Wall Street: a hacktivist group coordinates a large-scale, three-day attack on the capital markets meant to disrupt trading and confidence in the U.S. markets.

That's what SIFMA, the securities industry's trade group, tried to simulate in an exercise on Wednesday, which found that banks might be limited during an attack by laws that restrict information sharing.

"It's an inevitable instance that we're going to have cyberattacks," said Kenneth Bentsen, SIFMA's CEO said Wednesday on CNBC's "Closing Bell." "We have to work not just on prevention, but on response and recovery. And that's what these exercises are all about."



Continuing our technical deep dive series on the Applications and Desktops service, here is a blog from one of our star engineers, Daniel Seltzer.

Here is our next step toward simplification: the Citrix Applications and Desktops Service sets up the control plane with best practices in minutes, reducing the need to design and deploy it yourself.

Today, we are introducing the Remote PowerShell SDK. You can now perform operational tasks without logging into the user interface. Administrators can automate operations such as the creation of machine catalogs and delivery groups in the same way it’s done with XenApp and XenDesktop.

However, there are subtle differences.



Over the past 60 years, there have been over 2,000 major disasters declared in the United States. When disaster strikes, the economy takes a serious hit. Some businesses suffer financial loss so great that they never reopen. Natural disasters such as Hurricane Katrina and Hurricane Sandy have devastated local communities almost to the point of no return, costing billions in repairs to infrastructure and businesses and upending the lives of those personally affected.

However, hurricanes are not the only natural disasters that pose a threat to business continuity. Events such as tornadoes, floods, fires and snow storms all leave businesses vulnerable without proper disaster preparedness planning. With cloud-based systems like OfficeSuite® Phone, businesses have unlimited, remote access to phone and communications systems to ensure operations stay up and running.

Sources: Wall Street Journal, National Center for Environmental Information, and The Insurance Information Institute

Nicole is the Marketing Communications Specialist for Broadview Networks, a top 10 UC cloud provider in the nation, where she enjoys writing about the latest technology and cloud products businesses can leverage to maximize productivity, improve security and reduce costs.


The misclassification of freelancers has emerged as one of the biggest storylines within the booming gig economy. As we’ve all seen from the onslaught of seemingly endless lawsuits, the cost of non-compliance can be staggering.

Fines levied by the U.S. Department of Labor (DOL), IRS and state agencies for worker misclassification can exceed millions depending on the severity of the infractions. As more and more companies begin leveraging independent contractors, it’s paramount that they arm themselves with the tools, processes and information needed to mitigate their compliance risk.

The following list of compliance risks, while certainly not exhaustive, highlights the critical need for companies to take proper steps to ensure their contractors and employees are properly classified.



Friday, 18 September 2015 00:00

If not a BIA Survey, What? (Part 1)

Some time ago eBRP posted a blog article I chose to call “The BIA Survey: an Effort in Futility”.

I’ve been asked why I haven’t published a follow-up article (as the original implied). It’s less inertia and more a wish to avoid controversy. Not everyone agreed with my original premise; fewer still may agree with my solution.

The Business Impact Analysis (BIA) Survey has evolved over time into an often massive undertaking.  Organizations devote 6 months or more to determining the Survey questions, distribution, collection, collation, analysis – and the inevitable follow-ups to resolve discrepancies, anomalies and misconceptions.  Upon completion, not only are the results suspect (see my original article) but during the process, organizational changes often make some results invalid.



12 projects to aid coastal resilience, safeguard the public, ecosystems and coastal economies
Cellular view of Pseudo-nitzschia, a harmful alga that threatens the health of humans and marine mammals by creating toxins in filter-feeding fish and shellfish. (Credit: NOAA)

Bloom of Karenia brevis (red tide) leads to large fish kill in Texas. (Credit: With permission from The Brazosport Facts)

NOAA announced today 12 new research grants totalling nearly $2.1 million that will go to organizations from around the country seeking to address harmful algal blooms (HABs) and hypoxia, two of the most scientifically complex and economically damaging coastal issues.

Hypoxia and harmful algal blooms have become a national concern. Outbreaks of toxic algal blooms along the Pacific coast have shut down commercial and recreational shellfishing in portions of three states. Also, the large oxygen-depleted “dead zone” in the Gulf of Mexico imperils valuable commercial and recreational fisheries, and the persistent Lake Erie bloom has threatened public water supplies and the area’s $12.9 billion tourism industry.

“Understanding and predicting if an algal bloom will become toxic remains one of the biggest technical challenges,” said Mary Erickson, director of NOAA’s National Centers for Coastal Ocean Science, which is providing the funding. “These projects will help communities and agencies understand, detect, and predict toxic algae and hypoxia. They are part of a larger NOAA effort to develop a national network of ecological forecasts to protect communities and make them more resilient to these threats.”

The grants will allow these organizations to implement new monitoring technologies to address emerging HABs, and investigate the role of climate change, nutrient pollution, and other factors to better predict and manage blooms. They will also improve upon current monitoring and seasonal forecasting for HABs, as well as apply robotic technology to improve hypoxia monitoring. A list of the grants can be found here.

Distribution of dissolved oxygen in bottom water west of the Mississippi River delta (July 28–August 3, 2015). Black line denotes area with less than two milligrams of oxygen per liter of bottom water. (Credit: With permission from Nancy N. Rabalais, LUMCON, and R. Eugene Turner, LSU).

“Advancing NOAA’s ecological forecasting initiatives depends on sound science-based information that private and public officials need to make critical decisions to protect public health, understand environmental impacts, and mitigate economic damages to activities that are a vital part of the region’s economy,” said Russell Callender, Ph.D., acting assistant NOAA administrator for the National Ocean Service.

Every U.S. coastal state has suffered a bloom of harmful algae over the last decade, and species have emerged in new locations that were not previously known to have problems. A small percentage of blooms produce toxins or grow excessively, threatening the coastal environment and posing health threats to humans and animals. HAB toxins may kill fish or shellfish directly and can lead to illness and death in some marine birds and mammals, as well as in humans.

Scientists deploy an Environmental Sample Processor to detect toxic Alexandrium blooms in the Gulf of Maine. (Credit: NOAA)

During blooms, shellfisheries are monitored for HAB toxins by state agencies, and, when necessary, are closed to protect human health.  Because of the monitoring, commercially available shellfish are safe to eat.  Even blooms that are not toxic can cause damage by suffocating fish, blocking light from bottom-dwelling plants, or depleting the oxygen in the water.

Hypoxia, or low oxygen, can occur naturally but is often caused by poor water quality from human activities, such as excessive nitrogen or phosphorus pollution from agriculture fertilizer runoff, sewage, urban runoff, or other practices. Today, more than half of the studied U.S. estuaries have experienced hypoxia.

The National Centers for Coastal Ocean Science delivers ecosystem science solutions for NOAA’s National Ocean Service and its partners, bringing research, scientific information and tools to help balance the nation’s ecological, social and economic goals.

NOAA’s mission is to understand and predict changes in the Earth's environment, from the depths of the ocean to the surface of the sun, and to conserve and manage our coastal and marine resources. Join us on Facebook, Twitter, Instagram and our other social media channels.

Making money as a service provider depends on finding customers. Entrepreneurs see the opportunity in hosting, start a hosted services business and build a strong technical implementation following the Citrix Service Provider Reference Architecture, but then need more sales and marketing help to grow the business. Managed service providers want to add Desktops-as-a-Service (DaaS) to their offerings, but are unsure which current customers might purchase it.

But there is good news! The Citrix Service Provider program isn’t just about technology licenses. The Citrix Service Provider Partner Program provides end-to-end technology, business, and marketing support to thousands of service provider partners worldwide.

We’ve been tracking the trends around service provider marketing and watching how technology industry marketing is changing. Here are three key trends for what’s in and what’s out in DaaS marketing.



During September, National Preparedness Month, the Austin Joint Field Office is releasing a series of stories highlighting FEMA’s support of Texas communities as they take steps to reduce or eliminate long-term risk to people and property.

AUSTIN, Texas – Austin’s city leaders have seen disaster before and understand the folly of waiting and hoping one will never again hit this area.

With that in mind, they partnered with Travis County and the Central Texas Chapter of the American Red Cross to develop “Disaster Ready Austin.” Coordinated by the city of Austin’s Office of Homeland Security and Emergency Management (HSEM), the initiative aims to educate and empower residents to be prepared for emergencies and disasters.

The vision is a whole-community approach to disaster preparedness education in the city of Austin. “Our basic message to [residents] is to protect themselves,” said Jacob Dirr, public information and marketing officer of HSEM’s Community Preparedness Programs. “The goal is to educate Austin residents on basic preparedness for all types of hazards, including first aid tips and what to do in case of flash floods, wildfires, severe weather, pandemic flu or accidents involving hazardous materials.”

Online resources, such as contact cards and emergency kit checklists, are offered in English and Spanish at Homeland Security and Emergency Management | AustinTexas.gov. The HSEM Community Education and Outreach team members take advantage of scheduled meetings, such as Parent Teacher Association (PTA) gatherings at schools, where they offer presentations in English and Spanish.

Dirr notes that in some areas of the Austin community, young kids and their parents understand Spanish better than English.

Other audiences include Boy Scout groups, elderly care facilities, fairs, kids’ summer programs, area employers, community groups and school events.

One component of their community outreach at events is “Ready Freddie,” a character included in a children’s activity book called “Too Prepared to Be Scared,” which Dirr said is popular with parents and children. Featuring puzzles, games and animated figures to help get the preparedness message across, the booklet also has a certificate of appreciation children can receive when they finish.

“It’s full of colorful disaster-related advice such as information on developing an emergency supply kit, having an emergency plan and keeping pets safe,” Dirr said.

One of the biggest events attended by HSEM staff, including Dirr dressed in a life-size Ready Freddie mascot costume, was the “Back to School Bash” held at the downtown convention center. More than 100 vendors participated, with attendance exceeding 11,000.

To learn more about how cities and towns across Texas are building stronger, safer communities visit Best Practice Stories | FEMA.gov.


FEMA’s mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards. 

The 8.3-magnitude earthquake that struck off Illapel, Chile, on Thursday morning (Australian time) has once again highlighted the importance of tsunami warning systems in the world’s oceans. The earthquake occurred along the interface of the Nazca and South American Plates in Central Chile.

Latest reports indicate that five people have been killed and millions evacuated.

A sudden slip along this fault zone led to movement of the sea bed. This in turn generated a tsunami with 4.5-metre waves reported on the Chilean coast.

A Pacific-wide tsunami alert has been issued by the Pacific Tsunami Warning Center (PTWC) in the United States, based on earthquake data from the United States Geological Survey. The PTWC is one of a worldwide network of tsunami warning centres.



WASHINGTON– The Federal Emergency Management Agency (FEMA) and HOPE Coalition America (HCA), the emergency preparedness and financial recovery division of Operation HOPE, signed a memorandum of agreement yesterday renewing their 11-year collaboration to promote financial preparedness and support for recovery after emergencies and disasters. The renewal of this collaboration took place during National Preparedness Month, a nationwide, month-long effort hosted by the Ready Campaign, encouraging households, businesses, and communities to prepare and plan for emergencies.

“Being financially prepared before, during, and after a disaster can help families and communities recover faster when disaster strikes,” said FEMA Administrator Craig Fugate. “This memorandum of agreement will help to make our communities more financially secure and our nation more resilient.” 

The memorandum of agreement outlines a wide array of collaborative actions between FEMA and Operation HOPE, including efforts to provide pre-disaster financial education materials and information to communities; establishing and updating procedures to provide free financial guidance and case management to survivors in the event of a major disaster or emergency; and efforts to recruit and train volunteers to provide financial preparation and recovery guidance to survivors.

“Operation HOPE helps individuals, families and small businesses regain their financial health and economic stability after a natural disaster or national emergency,” said Operation HOPE Founder, Chairman, and CEO John Hope Bryant. “We’re pleased to renew our partnership with FEMA and assist their efforts to help Americans be better prepared for adverse events. As such, HOPE Inside locations nationwide will now include access and resources offering HCA services.”

Over the past several years, FEMA and HCA have leveraged resources from each other to help individuals and families prepare for disasters, or recover from disasters in the shortest possible time. FEMA has also partnered with Operation HOPE to encourage individuals, families and businesses to collect and safeguard the critical documents they will need to help them start the process through the Emergency Financial First Aid Kit (EFFAK). The EFFAK is a resource for financial preparedness, providing step-by-step instructions on the protection of personal assets and financial information to reduce vulnerability after a disaster. This simple tool can help Americans identify and organize key financial, insurance, medical, and legal records, and is available at www.ready.gov/financial-preparedness.


FEMA's mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain and improve our capability to prepare for, protect against, respond to, recover from and mitigate all hazards.

Follow FEMA online at www.fema.gov/blog, www.twitter.com/fema, www.facebook.com/fema and www.youtube.com/fema.  Also, follow Administrator Craig Fugate's activities at www.twitter.com/craigatfema.

The social media links provided are for reference only. FEMA does not endorse any non-government websites, companies or applications.

Security protocols are put in place to protect business interests. But are these security protocols also hurting your business?

Having a good security infrastructure in place is absolutely necessary in today’s work environment, but a new Dell study shows that good security has a negative impact on employee productivity.

Even worse, employees appear to dislike the restrictions imposed by security protocols: too many of them use workaround strategies to avoid them, which 70 percent of respondents said is creating the greatest security risk.



One of the sharp contrasts we can draw this week is between BMC and HP. In a weird way, BMC looks a lot like what HP’s board may have wanted: a software pure-play. But BMC has around 7,000 employees, and HP has around 300,000. This week, HP announced it would be cutting an additional 30,000 employees on top of the 55,000 already cut, approaching one-third of the workforce. This comes after management split the company in two, a move that should have increased the need for staff, since common services are no longer shared.

One of the interesting contrasts is that over the last decade or so, BMC has largely had one CEO, Bob Beauchamp, while HP has had a string of them, starting with Carly Fiorina, who came from telecom and marketing, and ending with Meg Whitman, who arrived after a failed bid for California governor and after being replaced at a far smaller eBay.



One of the biggest challenges in using an emergency notification service is keeping up with your contacts—especially if your organization is large or has high-volume turnover. How can you make sure your recipients aren’t missing vital communications without consuming significant time and resources?

The solution is simple. Let your notification recipients sign up for alerts and maintain their own contact information. Now your administrators can spend less time managing contact information and more time focusing on their main role.

A good notification service, like Send Word Now, will offer a Self-Registration tool or portal that facilitates data entry by your recipients. They simply create a password and provide their contact information all from a single web address. The portal even allows them to choose message preferences and provide the contact points at which they wish to be reached. Most importantly, they can update their contact information at any time—or when instructed to do so. We’ve noticed that customers want to make contact data management easy while focusing on business continuity, safety, and day-to-day tasks.
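
The workflow is simple enough to sketch. Below is a minimal, illustrative model in Python of the kind of self-maintained contact record such a portal keeps; it is not Send Word Now's actual API, and every class, method and field name here is a hypothetical stand-in.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Recipient:
    """One self-registered notification recipient (illustrative only)."""
    user_id: str
    password_hash: str                                             # never store raw passwords
    contact_points: Dict[str, str] = field(default_factory=dict)   # e.g. {"sms": "+1-555-0100"}
    preferences: List[str] = field(default_factory=list)           # preferred channels, in order

class SelfRegistrationPortal:
    """Tiny in-memory stand-in for a self-service contact registry."""
    def __init__(self):
        self._recipients: Dict[str, Recipient] = {}

    def register(self, user_id: str, password_hash: str) -> Recipient:
        rec = Recipient(user_id=user_id, password_hash=password_hash)
        self._recipients[user_id] = rec
        return rec

    def update_contact(self, user_id: str, channel: str, address: str) -> None:
        # Recipients maintain their own details, so administrators don't have to.
        self._recipients[user_id].contact_points[channel] = address

    def set_preferences(self, user_id: str, channels: List[str]) -> None:
        self._recipients[user_id].preferences = list(channels)

# Example: a recipient registers, adds contact points, and prefers SMS over email.
portal = SelfRegistrationPortal()
portal.register("jsmith", password_hash="<hashed>")
portal.update_contact("jsmith", "sms", "+1-555-0100")
portal.update_contact("jsmith", "email", "jsmith@example.com")
portal.set_preferences("jsmith", ["sms", "email"])
```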



Friday, 18 September 2015 00:00

Introducing the Masters of Disaster Podcast

I am happy to announce that the podcasting community on compliance, ethics and risk increased by an estimated 100% in September with the launch of the new podcast Masters of Disaster.   Earlier this month I was honored to have the other half of the ethics and compliance podcasting community, Tom Fox, graciously interview me about the launch of my new podcast on his podcast, the FCPA Compliance and Ethics Report.

Why “Masters of Disaster”? It is more fun to say than the “Risk, Ethics and Compliance Podcast,” of course. The podcast features interviews with masters in the fields of risk, ethics and compliance – all areas that can become disasters if not managed well. The podcast also includes interviews with people who work on making professionals more influential or healthy in their high-stress jobs.



Data centres with volumes above 80 TB may be more cost-effective if they use flash memory in place of traditional hard disk drives (HDDs), according to one expert.

Eric Burgener, a research director at International Data Corporation (IDC), made the claim in a recent whitepaper sponsored by Violin Memory, Computer Weekly reports.

Flash memory – which is used in smartphones and tablets, as well as solid-state drives – is demonstrably faster and more efficient than HDDs. However, the cost per GB is higher, at around $1.25 (£0.81) rather than $0.70 (£0.45), Mr Burgener said.

Nonetheless, his analysis revealed that at data centre volumes exceeding 80 to 90 TB, the other advantages of flash memory start to justify the extra outlay.
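
As a rough back-of-the-envelope illustration of how such a crossover can arise, the sketch below combines the per-GB prices quoted above with two assumed inputs that are not from the article: a data-reduction ratio for the all-flash array and a fixed platform premium. With those hypothetical values the break-even lands near the 80 to 90 TB range the analysis describes.

```python
# Back-of-the-envelope model of the flash vs. HDD crossover described above.
# Only the raw $/GB prices come from the article; the data-reduction ratio and
# the fixed platform premium are hypothetical inputs chosen to show how a
# break-even capacity in the tens of TB can emerge.

HDD_PER_GB = 0.70               # USD/GB, from the article
FLASH_PER_GB = 1.25             # USD/GB, from the article
FLASH_REDUCTION = 4.0           # assumed dedupe/compression ratio on flash
FLASH_FIXED_PREMIUM = 33_000.0  # assumed extra platform cost (controllers, etc.)

def cost_hdd(capacity_tb: float) -> float:
    return capacity_tb * 1000 * HDD_PER_GB

def cost_flash(capacity_tb: float) -> float:
    effective_per_gb = FLASH_PER_GB / FLASH_REDUCTION
    return FLASH_FIXED_PREMIUM + capacity_tb * 1000 * effective_per_gb

for tb in (20, 50, 80, 90, 120):
    hdd, flash = cost_hdd(tb), cost_flash(tb)
    winner = "flash" if flash < hdd else "HDD"
    print(f"{tb:>4} TB: HDD ${hdd:>9,.0f}  flash ${flash:>9,.0f}  -> {winner} cheaper")

# With these assumed inputs the crossover lands near 85 TB, broadly in line
# with the 80-90 TB threshold cited in the analysis.
```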



Just 27% of small businesses in the UK have a business continuity plan in place, compared to 68% of medium-sized organisations and 75% of large organisations, demonstrating that smaller organisations are not taking the threats to their operations seriously. Small businesses are not exempt from the possibility of a cyber attack, supply chain failure or weather-related incident, and they often have less capacity to absorb the costs of these incidents should they occur, meaning the ultimate impact can be even more devastating.

UK organisations are not alone however, as the findings of this study echo a recent survey in the US which also found that small businesses were not prepared for a disaster.

The sixth annual Data Health Check report, published by Databarracks, also revealed that 73% of the small businesses questioned admitted they hadn't tested their plan in the last 12 months, with nearly half not planning to within the next year. The report highlighted that disaster recovery testing had a huge impact on how confident organisations are in their DR solution. Of those organisations that had tested their DR plans within the last year, 58% were 'very confident' in them, with this figure falling to just 28% for non-testers.

The theme for Business Continuity Awareness Week 2015, run by the Business Continuity Institute, was testing and exercising and the key message was that a plan that has not been exercised is simply not a plan – you don’t want to find out during a crisis that it is not fit for purpose. Testing and exercising is a fundamental part of business continuity and must not be excluded from the process.

Oscar Arean, technical operations manager at Databarracks, commented: "It's not surprising to find that small businesses are less likely to have a BCP than larger businesses. What is worrying is the lack of improvement we've seen for small businesses in the last 12 months. Sometimes it takes a prolonged period of downtime or a substantial data loss for a business to realise the importance of a robust DR solution, but it shouldn't come at that cost. We need to see a culture shift and perhaps some of that responsibility falls to the service providers as well as the customers. DR providers need to educate organisations on the importance of disaster recovery planning and testing, and demonstrate how vulnerable they are if this isn't done. Disaster recovery isn't a luxury insurance policy anymore, it's absolutely essential for businesses no matter what size."

The Federal Emergency Management Agency has extended the deadline for flood insurance policyholders to submit their Hurricane Sandy Claims for review. The last day to submit claims is now Oct. 15, 2015.

The U.S. Department of Housing and Urban Development announced today that any additional flood insurance proceeds up to $20,000 will not be treated as duplicative. Federal agencies cannot provide disaster assistance for losses covered by insurance. HUD’s announcement stated that “this will eliminate the need for HUD grantees to reclaim assistance from these households or to repay those funds through non-federal sources. To date, three out of four National Flood Insurance Program (NFIP) claimants have received less than $20,000 in additional compensation from FEMA and will not face any possible repayment.”

Roy Wright, FEMA’s Deputy Associate Administrator for Insurance and Mitigation, encouraged policyholders to call FEMA and request a review if they believe their claims were underpaid for any reason. As of Sept. 14, nearly 14,000 policyholders have requested reviews of their Sandy flood insurance claims.

“FEMA remains committed to making sure that every policyholder gets every dollar they are owed under their flood insurance policy. Already, thousands of policyholders have contacted us to have their claims reviewed and we have begun providing funds to those who were due additional payments on their claim,” Wright said.

“We are hopeful that HUD’s action to provide relief to the vast majority of those who are concerned about potential duplicative benefits will encourage even more policyholders who may have been initially reluctant to enter the process to do so,” Wright said. “In light of HUD’s decision to simplify this review and provide relief, we are extending the claims review deadline until October 15th.  We hope by extending the deadline we are addressing any remaining concerns some may have about entering the claims review process. The review process we have established is designed to be simple, fair, and accessible without paid legal assistance. FEMA is dead set on restoring trust in this important program and no one should be discouraged from having their claim reviewed.”

Policyholders can call the NFIP’s Hurricane Sandy claims center at 866-337-4262 from 8 a.m. to 8 p.m. Eastern Daylight Time (EDT), Monday through Friday to request a review.  It is important to have your policy number and insurance company name when you call.

Policyholders also can go online to www.fema.gov/hurricane-sandy-nfip-claims to download a form requesting a review. The downloaded form can be filled out and emailed to FEMA-sandyclaimsreview@fema.dhs.gov or faxed to 202-646-7970 to begin the review process. For individuals who are deaf, hard of hearing, or have a speech disability and use 711 or VRS, please call 866-337-4262.  For individuals using a TTY, please call 800-462-7585 to begin the review process.     



Thursday, 17 September 2015 00:00

Multiple Approaches to Container Scalability

Few enterprises have made serious inroads into the emerging field of container virtualization, but already there is growing concern that the technology might not be as effective as advertised in supporting advanced applications and microservices – at least not yet.

At the moment, the big issue is scalability. Docker, the leading container developer, has made no secret of its desire to incorporate greater scalability on its platform, primarily the ability to enable more efficient networking between large numbers of containers. To that end, the company has offered a number of orchestration and management tools through joint development projects with companies like Red Hat, Amazon and IBM.
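
To make the scaling problem concrete, here is a minimal single-host sketch using the docker Python SDK (docker-py); the image, replica count and naming scheme are illustrative assumptions, and real orchestration tools add scheduling, cross-host networking and health management on top of this.

```python
# Minimal single-host sketch with the docker-py SDK: launch N copies of a
# stateless service and list what is running. Orchestration platforms do the
# equivalent across many hosts, which is where the scalability work lies.
import docker

N_REPLICAS = 5                      # how many copies of the service to run
IMAGE = "nginx:alpine"              # any small stateless image will do

client = docker.from_env()          # talks to the local Docker daemon

containers = []
for i in range(N_REPLICAS):
    c = client.containers.run(
        IMAGE,
        name=f"demo-web-{i}",       # hypothetical naming scheme
        detach=True,                # return immediately, leave the container running
    )
    containers.append(c)

# Enumerate the replicas that were just started.
for c in client.containers.list(filters={"name": "demo-web-"}):
    print(c.name, c.status)

# Clean up.
for c in containers:
    c.remove(force=True)
```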



Thursday, 17 September 2015 00:00

Blockchain: It Really is a Big Deal

By Arvind Krishna

Over the past two decades, the Internet, cloud computing and related technologies have revolutionized many aspects of business and society. These advances have made individuals and organizations more productive, and they have enriched many people’s lives.

Yet the basic mechanics of how people and organizations forge agreements with one another and execute them have not been updated for the 21st century. In fact, with each passing generation we’ve added more middlemen, more processes, more bureaucratic checks and balances, and more layers of complexity to our formal interactions, especially financial transactions. We’re pushing old procedures through new pipes.

This apparatus, the red tape of modern society, extracts a “tax” of many billions of dollars per year from the global economy and businesses.
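
For readers wondering what actually sits underneath the term, the sketch below is a deliberately tiny, purely illustrative hash chain in Python: each record commits to the one before it, which is the tamper-evidence property that lets parties transact without a middleman vouching for the ledger. Production blockchain platforms add consensus, peer-to-peer replication and smart contracts on top of this basic idea.

```python
# A deliberately tiny hash chain: each block's hash covers the previous block's
# hash, so tampering with any past record breaks every hash after it.
import hashlib
import json
import time

def make_block(data: dict, prev_hash: str) -> dict:
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def verify(chain: list) -> bool:
    for i, block in enumerate(chain):
        expected_prev = "0" * 64 if i == 0 else chain[i - 1]["hash"]
        body = {k: v for k, v in block.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if block["prev_hash"] != expected_prev:
            return False
        if block["hash"] != hashlib.sha256(payload).hexdigest():
            return False
    return True

chain = [make_block({"agreement": "A pays B 100"}, "0" * 64)]
chain.append(make_block({"agreement": "B ships goods to A"}, chain[-1]["hash"]))

print(verify(chain))                            # True
chain[0]["data"]["agreement"] = "A pays B 1"    # tamper with history
print(verify(chain))                            # False -- the chain no longer validates
```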



The number-one rule of safely downloading apps is to use the official app marketplace, whether it is the App Store or Google Play, or a vendor’s store.

That’s why the news from Bitdefender researchers is so alarming. They discovered sophisticated CAPTCHA-bypassing Android malware in Google Play apps. The malware itself was discovered in 2014, but at that time it was distributed through third-party sites. According to a release, this is the malware’s first occurrence in the official Google Play store, as it appears that the malware developers discovered new ways of packing it into seemingly legitimate apps that can bypass Google’s vetting system.

The malware takes advantage of the authentication system. As Tech City News explained:



Thursday, 17 September 2015 00:00

ASIS releases new Risk Assessment Standard

ASIS International has published a new standard, which it has developed in conjunction with RIMS.

Entitled Risk Assessment ANSI/ASIS/RIMS RA.1-2015, the standard “provides guidance on developing and sustaining a coherent and effective risk assessment program including principles, managing an overall risk assessment program, and performing individual risk assessments, along with confirming the competencies of risk assessors and understanding biases.”

Risk Assessment provides guidance for:

  • Establishing a risk assessment program and conducting individual risk assessments consistent with ISO 31000:2009 Risk management - Principles and guidelines, as well as the Committee of Sponsoring Organizations of the Treadway Commission (COSO) Enterprise Risk Management framework;
  • Conducting risk assessments for risk and resilience based management system standards for the disciplines of risk, resilience, security, crisis, business continuity, and recovery management.

More details.

Thursday, 17 September 2015 00:00

Why Your BIA Method Matters

In this paper, Stephen Massey describes why the BIA is so important in establishing an effective BCMS and which methods are most efficacious; how organizations must avoid confusing efficiency with efficacy; and why the BIA process must be treated as a learning and development exercise.


Organizations wishing to implement robust business continuity programmes need to conduct business impact analysis (BIA). Given the complexity of the BIA process and the limited resources available to collect data, there is a need to identify efficacious collection methods to support business units and business continuity practitioners in completing the task, so that effective risk assessment and business continuity planning can take place. However, which method is the most efficacious, and what are the factors affecting efficacy? This paper attempts to answer this and the following questions using the scientific method:



Wednesday, 16 September 2015 00:00

3 Pros and Cons of Exercising

In this short blog, we have identified three advantages and three disadvantages of tabletop, functional, full-scale, and corporate war game exercises.

These are not the top three, or the bottom three in terms of our findings. They are simply a few pros and cons of conducting the main types of exercises.



Wednesday, 16 September 2015 00:00

How to Stay in Business Through a Disaster

Disaster can strike at any time. Crisis manuals may be lengthy, but they should rest on three key pillars.

On 11th March 2011, the public announcement protocols gave Kazunobo (informally known as “Kaz”) and his investment banking team approximately one minute of advance warning before the magnitude 9 earthquake, whose epicenter was some 450km away, began to shake his Tokyo office. For Kaz a swaying office was by no means unusual, but he was now experiencing the largest earthquake to hit Japan since records began, one which shifted the country’s main land mass 8ft to the east and jolted the Earth on its axis by as much as 25cm.

The significant damage to the country caused by these violent tremors was soon eclipsed by the destruction brought by the ensuing tsunami. The wave, sometimes reaching 40m tall, washed more than 10km inland, killing 16,000 people and displacing more than 200,000. Some 4.4 million households were left without electricity and 1.5 million without water. The Fukushima nuclear reactor melted down.

Read more at http://knowledge.insead.edu/blog/insead-blog/how-to-stay-in-business-through-a-disaster-4239#Rjylxi2tRgg1wk26.99
Wednesday, 16 September 2015 00:00

Data Programs Revolutionize Cincinnati Government

There’s a dream out there among those in the top levels of local government that one day they will be able to lead with clear, concise data from every department they oversee. For many, this dream will die a slow and agonizing death as efforts to break down bureaucratic silos fall short and daily operational demands outpace the potential for change.

It’s nothing against these dreamers, of course; it’s just that governments, like people, get comfortable and fear change, even if it’s for the better.

But even over a bad telephone connection, Harry Black’s tone and tenor are enough to convince you that he lacks the complacency so often found in others of his station. He comes across as a man with a crystal-clear vision for his city.



Wednesday, 16 September 2015 00:00

The Myth of Departmental Continuity Plans

Since the early days of Business Continuity Planning, many organizations have chosen to focus efforts on “Worst Case” and “Hole-in-the-ground” scenario planning, and on Departmental Continuity Plans. The value of Departmental Continuity Plans is a myth. Planning for an artificial organizational construct, ‘the Department’, shifts emphasis away from the real reason BCM exists: to resume delivery of the organization’s critical Products & Services.

The effort expended to create a Departmental Business Continuity Plan often has two negative outcomes:



Wednesday, 16 September 2015 00:00

What the Industry Gets Wrong About Security

“The technology industry has grossly over-hyped the value of its products and built empires around consulting and spreading fear.”

Provocative stuff, right? No question. But it’s even more so when you consider that the preceding quote comes from a thought leader whose company is making a major push into security. It’s true. The company is LogicNow, and the author of the quote is Ian Trump, LogicNow’s Security Lead.

On Sept. 10 in Washington, MSPmentor sat down with Trump, who was in town for the company’s fifth summit for MSP partners in the Americas.



Security for the Internet of Things (IoT) is vitally important, but challenging to provide. Computerworld’s Kenneth van Wyk pointed to three traits of human nature as obstacles to building security into the IoT: naïveté, ignorance and laziness. He may well need to add a fourth to the list: competition.

The problem, according to van Wyk, is that companies need to position themselves quickly to take advantage of the money that is on the table and the incredible amount of it that will be thrown into the pot over the coming years.

van Wyk says that though we may understand why IoT developers were naïve, ignorant and lazy about some elements of security, it is certainly not forgivable: 



When you think “community,” what comes to mind? Maybe you immediately think of neighbors and friends who live nearby, or perhaps local businesses, churches, civic organizations and others. What about some of your regular stops around your community such as your pharmacy where you fill your prescriptions or buy over-the-counter medicine? Most people have trusted community partners they know well and with whom they interact regularly in everyday life. In fact, some of these same community partners are working with local, state, and federal public health planners to help your communities prepare for emergencies.

Pharmacists set up a screening station in a recent exercise to determine which medicine a person should receive to counteract exposure to anthrax.

In June 2015, a Costco warehouse in Potomac Mills, VA, partnered with Prince William Health District (Virginia Department of Health) to show how a private business can step up to help its community in an emergency. In this particular exercise, Costco regional pharmacy staff exercised a local plan to dispense medication – actually empty training bottles – to nearly 200 public volunteers as part of an open, or public, point of dispensing (POD). The scenario was based on a large-scale anthrax attack that would require mass dispensing of antibiotics from the Strategic National Stockpile (SNS). In an emergency where many people were exposed to anthrax, these antibiotics would help prevent people from becoming sick.

Volunteers wait to receive their “pill bottles” in a recent exercise with Costco and Prince William, Va. Health District to test a public dispensing plan in an emergency.

“Public health and the private sector can share resources and work together as a community to reach one goal,” said Patrick Ashley, emergency preparedness and response coordinator for Prince William Health District. “We have realized that government cannot do everything on its own – and shouldn’t. The success of this exercise comes from having engagement on all sides. Costco came to the table and has been a great partner.”

This public/private dispensing pilot, facilitated by the Centers for Disease Control and Prevention (CDC), demonstrates how large private retailers can partner with state and local public health departments to dispense medications to the public in an emergency. Costco has partnered with public health to operate both closed PODs, which would serve its own employees and their families, and public PODs open to the larger community.

“Costco has a history of serving the community,” said Christopher Loving, Costco regional pharmacy supervisor in Virginia. “This was a great opportunity for us to show our region’s pharmacy managers how this kind of event would work. The POD exercise at Potomac Mills was a huge success.”

At CDC, we know that all response begins locally, and a resilient community is simply one that has made itself ready to use all of its available resources to plan for, respond to, and recover from an adverse event. The real key to creating resilient communities is to strengthen day-to-day activities that help keep the community healthy and thriving.

Regional Costco pharmacists in Virginia exercise a local dispensing plan to respond to an emergency that would result in wide-spread exposure to aerosolized anthrax.

“The SNS is the nation’s repository of medications and medical devices for responding to public health emergencies, and we focus our efforts on helping build resiliency by ensuring that everyone in a community has access to the life-saving material we can provide,” said Greg Burel, director of CDC’s Division of Strategic National Stockpile. “The National Stockpile’s supplies cannot stop significant health problems after a disaster if communities are not resilient, so we work to facilitate relationships, train people and create strong partnerships between public health and the community. By including private businesses like Costco in these planning efforts, we are able to reach more people who rely on and trust their community partners.”

From the astute healthcare provider who recognizes that a disease is emerging in a community that could pose a public health threat, to the company that wants to make sure its employees and their families are safe, all of us have an important role to play in making our communities resilient. In an emergency, the whole community will be affected. If public health jurisdictions and the private sector can collaborate on planning and partnerships in advance to make that community resilient before something bad ever happens, we are all ultimately safer.

Wednesday, 16 September 2015 00:00

Design Thinking in Compliance

In many ways the migration from Chief Compliance Officer (CCO) 1.0 to 2.0 and beyond is about more than the technical aspects of the CCO role; it is about the internal and external delivery of a compliance solution by the compliance function. The Department of Justice (DOJ) and the Securities and Exchange Commission (SEC) have both consistently articulated that a Foreign Corrupt Practices Act (FCPA) anti-corruption compliance program should be an evolving solution, dynamic rather than static. Compliance is a business solution to a legal issue and must evolve to meet the ever-changing dynamics of a progressively globalized marketplace and international enforcement of anti-bribery laws. To think that the drafters of the FCPA foresaw every business challenge that would appear over the intervening 35+ years belies the path of legal and commercial developments in that time frame.

One of the areas of development that can be of use to the compliance practitioner is design thinking in your compliance program. This issue was explored in a series of articles in the September issue of the Harvard Business Review (HBR). In an article entitled “Design Thinking Comes of Age,” Jon Kolko posited that “the approach, once used primarily in product design, is now infusing corporate culture.” For the CCO or compliance practitioner, this means recognizing that you have customers: your employees and the third parties that may fall under your compliance program. All of these groups have a user experience in doing compliance that may be complex and interactive. As a CCO 3.0 or beyond, you will need to design a compliance infrastructure around the way people work so that doing compliance becomes burned into the DNA of the workforce.



(MCT) - Disasters can happen at any time.

That’s why officials say it’s important to always have a plan and to be prepared.

September marks the 12th annual National Preparedness Month. This annual campaign is hosted by the Federal Emergency Management Agency and the Department of Homeland Security.

National Preparedness Month helps to educate the public and helps people prepare for emergencies that may occur in the community, including natural disasters like floods, extreme winters and power outages.



We know our technology is unrivaled, and as part of our “work better, live better” mantra, we want our partners to share our tech far and wide. One of the ways we do that is by baking partner incentives into our global plan to be the best business for Citrix partners to work with.

We have a variety of incentives designed to drive your success and make your business more profitable – take advantage of them!



Wednesday, 16 September 2015 00:00

Enterprise Risk Lagging Globally, Study Finds

Despite a widening range of risks faced by organizations globally, less than 35% of companies say they have an enterprise risk management (ERM) plan in place. What’s more, 70% would not describe their oversight as mature, according to the Chartered Global Management Accountant (CGMA) report Global State of Enterprise Risk Oversight 2nd Edition.

The study found that 60% of boards of directors globally are pressuring their companies to increase involvement of senior management. The U.S. is lagging in some areas, with only 46% of its boards assigning risk oversight responsibilities to a committee compared to 70% globally.



Wednesday, 16 September 2015 00:00

Risk Has a Shape

One of the central tenets of risk management is the idea that we understand “risk.”  Most definitions of risk management include terms such as assessment, evaluation, identification, control, transfer, reduction, retention and so on to describe what should be done to risk to protect the firm, patient or enterprise from bad outcomes.  The benefits ascribed to risk management include the achievement of a firm’s business goals, better patient care, improved decision making, risk-adjusted returns on capital and a host of superlatives attributed to the proper risk framework or leading practice.

Very few risk management definitions actually explain how to accomplish these results with any degree of specificity. Instead, the definition includes vague descriptions of activities that lead to risk management. For example, a hospital definition of risk management: “The constellation of activities—planning, organizing, directing, evaluating and implementing—which are involved in reducing the risk of injury to patients and employees, as well as property damage or financial loss in a health care facility.” This definition, like many others of similar ambiguity, is no more than “trial and error” disguised as risk management. What is clear is that we may not truly understand risk as well as we think we do!



Every MSP offering a cloud-based file sharing service knows that few other IT infrastructures out there can compete with the cloud as far as security goes. In fact, advanced security features, such as multi-factor authentication and end-to-end encryption, are almost taken for granted in the world of cloud computing.

No doubt this is good news. However, cloud services should not make the mistake of taking their security for granted if they don’t want to witness a catastrophic fall someday. Take the recently released report on data breaches by Ponemon, which clearly finds that costs related to attacks against IT services are increasing.

While the report is more of a commentary on IT security in general, it does hint that even though the cloud hasn’t seen any major security breaches yet, MSPs cannot afford to become complacent. Remember, there is a lot more creativity dedicated to cracking cloud security than to securing it. So the more proactively you defend yourself, the lower the likelihood that you will witness an attack. Security, in other words, is always a work in progress.
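
Proactive defence does not have to be exotic. As one concrete example of the layered controls mentioned above, here is a minimal sketch of the time-based one-time password (TOTP) algorithm defined in RFC 6238, the mechanism behind most authenticator-app second factors; it uses only the Python standard library, and the demo secret is arbitrary.

```python
# Minimal time-based one-time password (TOTP, RFC 6238) generator.
# Standard library only; not a drop-in replacement for a hardened MFA product.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    """Return the current TOTP code for a base32-encoded shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period            # 30-second time step
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Example with a demo secret; the verifying service holds the same secret and
# accepts the code only within a small time window.
print(totp("JBSWY3DPEHPK3PXP"))
```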



CVS has confirmed its photo website, CVSphoto.com, was breached this summer.

And as a result, the pharmacy chain tops this week's list of IT security newsmakers to watch, followed by Excellus BlueCross BlueShield, ESET and the Atlantic Council.

What can managed service providers (MSPs) and their customers learn from this week's IT security newsmakers? Find out in this edition of IT security stories to watch:



Tuesday, 15 September 2015 00:00

The MSP Private Cloud Opportunity

More than 70 percent of companies have implemented private cloud solutions, and 71 percent of them report easier application management as a result, according to new research from Aberdeen Group. Private clouds have also reduced IT complexity for 46 percent of businesses and accelerated application deployment for 45 percent of them.

Yet, Aberdeen finds, some companies remain skeptical of private clouds, fearing increased costs and complexity. But those businesses, Aberdeen warns in its July 2015 report, “A Simple Path to Private Cloud,” are bound to continue paying too much for application deployment and administration. And that limits their ability to innovate and leverage future technology advances.

Aberdeen posits that fears over complexity and cost are misplaced. Businesses instead should endeavor to assess and understand their critical application needs, and work with a cloud services provider to fulfill those needs.



Tuesday, 15 September 2015 00:00

Achieve Operational Efficiency with DevOps

The traditional model of deploying new software and services is obsolete and unable to meet today’s fast-paced and evolving business demands. 

Consumers are demanding access to applications and services in real time while expecting an unparalleled user experience. The convergence of web and mobile technologies has also accelerated the need for continuous improvement of applications. Needless to say, enterprises are struggling to keep up.

DevOps may provide the answer, as it accelerates time to market for new apps and services and streamlines application deployment while achieving operational efficiency. From a CIO’s perspective, the adoption of DevOps is transformational. Continuous integration is not just about adopting new tools, nor is it just about a new IT methodology; it is about an enterprise-wide shift in how IT is organized.
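
To make “continuous integration” concrete, here is a toy pipeline runner in Python: every commit triggers the same build, test and deploy stages, and the run stops at the first failure. The commands are hypothetical placeholders; real CI systems such as Jenkins or GitLab CI express the same idea declaratively.

```python
# A toy continuous-integration pipeline: run the stages in order on every
# commit and stop at the first failure. The shell commands are placeholders --
# substitute your project's real build, test and deploy steps.
import subprocess
import sys

PIPELINE = [
    ("build", ["python", "-m", "compileall", "."]),      # placeholder build step
    ("test", ["python", "-m", "pytest", "-q"]),          # placeholder test step
    ("deploy", ["echo", "deploying to staging..."]),     # placeholder deploy step
]

def run_pipeline() -> int:
    for stage, cmd in PIPELINE:
        print(f"--- {stage} ---")
        result = subprocess.run(cmd)
        if result.returncode != 0:
            print(f"{stage} failed; stopping pipeline.")
            return result.returncode
    print("pipeline succeeded")
    return 0

if __name__ == "__main__":
    sys.exit(run_pipeline())
```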



WASHINGTON—The U.S. Department of Homeland Security’s Federal Emergency Management Agency (FEMA), in coordination with state, local, and tribal emergency managers and state broadcasters’ associations, will conduct a test of the Emergency Alert System (EAS) on Wednesday, September 16, 2015, in six New England states.  The test will begin at 2:20 p.m. Eastern Daylight Time (EDT), and will last approximately one minute. 

The voluntary EAS test will be seen and heard over many radio, television and cable stations in Maine, Vermont, New Hampshire, Connecticut, Massachusetts, and Rhode Island. The EAS test might also be seen and heard in upper New York State if the public normally receives any broadcasts from nearby New England stations.  The word “national” will be added to the test message: “This is a national test of the Emergency Alert System. This is only a test.” 

“The EAS test message will be sent to radio and television stations using a National Periodic Test code that sounds and appears like the regular monthly EAS tests conducted by state officials and broadcasters,” said Roger Stone, Acting Assistant Administrator of FEMA’s National Continuity Programs. “FEMA is working to specify a method for conducting periodic nationwide EAS tests using the National Periodic Test code in the near future.”

The test is designed to have limited impact on the public, with only minor disruptions of radio and television programs that normally occur when broadcasters regularly test EAS in their area. There is no Federal Communications Commission regulatory liability for stations that choose not to participate.

The test will assess the operational readiness of FEMA’s Integrated Public Alert and Warning System (IPAWS) infrastructure, which will distribute the national-level EAS test message to radio, television and cable operations, from origination to reception by the public. It will verify that EAS stations can receive and broadcast a national test message. The test requires that radio and television stations make a minor configuration change to their EAS equipment to receive and process the National Periodic Test code message from the IPAWS system.

In 2007, FEMA began modernizing the nation’s public alert and warning system by integrating new technologies into existing alert systems.  The new system is known to broadcasters and local alerting officials as the Integrated Public Alert and Warning System or IPAWS.  IPAWS connects public safety officials, such as emergency managers, police and fire departments to multiple communications channels to send alerts to the public when a disaster or other imminent danger occurs. 

More information on the Public Alert and Warning System and Wireless Emergency Alerts (WEA) is available at www.fema.gov/ipaws or www.ready.gov/alerts. For more information on IPAWS, visit www.fema.gov/media-library/assets/documents/31814.



(MCT) - Having enough water on the Missouri River hasn’t been a major problem this year. At the end of August, precipitation was 108 percent of normal, even with below-normal snowfall last winter. Currently, the U.S. Army Corps of Engineers has 60.1 million acre feet of water stored behind the six dams it operates along the Missouri River.

However, this is not a time for those who live along the river or use the river to be complacent.



Tuesday, 15 September 2015 00:00

As Technology Gets Smaller, Risks Get Bigger

Microelectronics is changing the way we live, work and do business. With circuitry thousands of times smaller than a human hair, microelectronics has become the brains behind almost every business. But shrinking technology makes equipment more vulnerable to breakdowns, especially when it’s portable and fragile. To manage the risk, you need to keep up with these evolving exposures to protect your organization from loss.

Insurance is changing as well, to reflect this new technology. Think of all the equipment that relies on micro-circuitry. From building systems to communications, if it uses electricity, it likely operates with tiny transistors and microprocessors. Our claims data shows that micro-circuitry is prone to break down and is difficult to repair.

Yet, most property coverage does not cover equipment breakdowns and typical equipment breakdown insurance requires proof of physical damage. That can leave a business without coverage for repair or replacement, business interruption and data loss caused by today’s technology losses, unless the policy specifically covers microelectronics failures.



Tuesday, 15 September 2015 00:00

New Challenges Arise for Banking Security

It’s been a while since we’ve talked about banking security, but it appears new banking malware is making the rounds.

IBM’s X-Force team discovered the banking Trojan, which is primarily targeting the Japanese financial market. It has been named Shifu – the Japanese word for thief – and as IBM’s Security Intelligence blog reported:

The Shifu Trojan may be a new beast, but its inner workings are not entirely unfamiliar. The malware relies on a few tried-and-true Trojan mechanisms from other infamous crimeware codes. It appears that Shifu’s internal makeup was composed by savvy developers who are quite familiar with other banking malware, dressing Shifu with select features from the more nefarious of the bunch.



Automating the data center is one of those things that evokes conflicting emotions in enterprise executives. After all, who wouldn’t want a virtually hands-free data ecosystem in which everybody’s needs are satisfied at a moment’s notice? Then again, no one, not even the people building the automation stacks, believes such functionality is realistic.

But while it is true that automation is not likely to produce Star Trek-esque data service any time soon, the fact is that today’s platforms can and do improve data processes quite a bit, and implementation, particularly on virtual and abstract architectures, is not nearly as cumbersome as it was just a few years ago.



DENTON, Texas – “Don’t Wait. Communicate. Make Your Emergency Plan Today.” That’s the message emergency managers are sharing with people all over Texas and beyond during the month of September.

September is National Preparedness Month and the Federal Emergency Management Agency’s (FEMA) Region 6 Office is urging everyone to take steps to make a plan and know what to do during an emergency.

Whether you deal with the possible threats of flooding, wildfires, hurricanes or power outages, the preparedness steps to take are the same. They include:

•    Knowing your risk for where you live;
•    Having an individual and family preparedness plan in place;
•    Practicing that plan;
•    Putting together an emergency kit with water and non-perishable supplies to last for at least three days for you, your family and your pets;
•    Ensuring that your contact list is up-to-date for people you may need to reach out to during a disaster; and
•    Establishing alternative methods of communication in case traditional means are not available.

Additionally, September 30 is National PrepareAthon! Day. You are encouraged to participate by doing a simple, specific action or activity to improve your preparedness and your family’s preparedness; or it can be something more elaborate that involves your neighborhood, your place of worship, your entire workplace or your community.

Visit www.ready.gov or www.ready.gov/prepare for more information on America’s PrepareAthon! You can find tools to stage your own emergency preparedness drills, as well as register any preparedness activities for you or your community.  


FEMA’s mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards. Follow us on Twitter at http://twitter.com/#!/femaregion6, and the FEMA Blog at http://blog.fema.gov.

When it comes to cybercrime, the UK has the dubious honour of being the most commonly targeted country in the world, according to a new study.

Carried out by data security company ThreatMetrix, the research found that British firms are attacked 50 per cent more frequently than their counterparts in the US, and by hackers located as far afield as Nigeria and Mexico.

It also found that ecommerce is currently one of cybercrime’s top targets, with attacks up 20 per cent in the second quarter of 2015, and noted that recent growth in mobile use has given rise to fresh opportunities for fraudsters to conduct spoofing attacks.

Dr Stephen Moody, solutions director for EMEA at ThreatMetrix, commented: “The more businesses and consumers turn to the digital space to store and manage their financial information, the more fraudsters will be on high alert.”

The study’s findings highlight how UK companies must manage key data effectively to protect their customers’ privacy and insure themselves against data loss.

Kroll Ontrack provides software for MS Exchange and SharePoint, solutions for permanent data erasure, and services for tape archives, as well as data recovery.

From: http://www.krollontrack.co.uk/company/press-room/data-recovery-news/uk-tops-list-of-most-attacked-countries-in-cybercrime-study379.aspx

Tuesday, 15 September 2015 00:00

Gauging the Impact of Reputational Risk

The following article is part of a continuing blog series that will explore ideas, concepts, discussions, arguments and applications associated with the field of enterprise and strategic risk management.

In my previous article, I made the point that the public discussion of reputational risk lacks a set of common standards or definitions. This lack of consistency allows organizations to interpret or define the concept of reputational risk in very different ways. For some, reputation is beginning to be viewed as something like the “risk of risks” in the same way people are starting to discuss the concept of the “internet of things.” I questioned whether reputation or brand is actually a risk or a residual event stemming from other extenuating risk domains or actions.

Upon further reflection and discussions with academics and risk professionals who are thinking carefully about this issue, I would go further now to suggest that reputation or brand risk involves perceived or real human behaviors that are, to some extent, measured against societal, economic or moral standards. The adherence or deviation from established standards generates the basis for the risk, and the variability from the standard influences the duration of the outcome.



Monday, 14 September 2015 00:00

Where Next for Government Cybersecurity?

Everyone in America remembers where they were on September 11, 2001. As we think back over the years, there have been physical attacks thwarted and numerous close calls.

Over the past few years, the number of serious online incidents impacting national security has skyrocketed. We live in a far different online world today than most people imagined when the Department of Homeland Security (DHS) was formed back in 2003.

And while the DHS leadership team has changed, the cyberthreat landscape has also grown dramatically, along with a new determination to strengthen our digital defenses. Meanwhile, the OPM data breach and several White House data breaches have propelled cybersecurity to the top of the national security priority list.

Recently, DHS Secretary Jeh Johnson appointed Dr. Andy Ozment to the role of Assistant Secretary of the Office of Cybersecurity and Communications (CS&C) within the National Protections and Programs Directorate (NPPD). As the DHS website points out, Dr. Ozment “oversees a budget of almost $930 million and leads a Federal employee workforce charged with enhancing the security, resilience, and reliability of the Nation’s cyber and communications infrastructure.”



Now that IT security is core to almost any managed service, MSPs should take note of the fact that Symantec (SYMC) is in the process of building out a set of cloud security services that ultimately will make it possible for just about any MSP to add managed security services to its portfolio.

At the core of that effort are investments Symantec is making in advanced analytics, machine learning, telematics and security broker technologies that the company will use to stand up a cloud security service, said Amit Jasuja, senior vice president of Products for Enterprise Security at Symantec.



Monday, 14 September 2015 00:00

FEMA To Evaluate Readiness Of Maryland

PHILADELPHIA – The Federal Emergency Management Agency (FEMA) will evaluate a Biennial Emergency Preparedness Exercise at the Calvert Cliffs Nuclear Power Plant. The exercise will occur during the week of September 14th, 2015 to assess the ability of the State of Maryland to respond to an emergency at the nuclear facility.

“These drills are held every other year to evaluate government’s ability to protect public health and safety,” said MaryAnn Tierney, Regional Administrator for FEMA Region III. “We will assess state and local emergency response capabilities within the 10-mile Emergency Planning Zone as well as the adjacent support jurisdictions within the State of Maryland.”

Within 90 days, FEMA will send its evaluation to the Nuclear Regulatory Commission (NRC) for use in licensing decisions. The final report will be available to the public approximately 120 days after the exercise.

FEMA will present preliminary findings of the exercise in a public meeting at 11:00 a.m. on September 18, 2015, at the Sheraton Annapolis Hotel, 173 Jennifer Road, Annapolis, MD 21401. Scheduled speakers include representatives from FEMA, NRC and the State of Maryland.

At the public meeting, FEMA may request that questions or comments be submitted in writing for review and response. Written comments may also be submitted after the meeting by emailing FEMAR3NewsDesk@fema.dhs.gov or by mail to:

MaryAnn Tierney
Regional Administrator, FEMA Region III
615 Chestnut Street, 6th Floor
Philadelphia, PA 19106

FEMA created the Radiological Emergency Preparedness (REP) Program to (1) ensure the health and safety of citizens living around commercial nuclear power plants would be adequately protected in the event of a nuclear power plant accident, and (2) inform and educate the public about radiological emergency preparedness.

REP Program responsibilities cover only “offsite” activities, that is, state and local government emergency planning and preparedness activities that take place beyond the nuclear power plant boundaries. Onsite activities continue to be the responsibility of the NRC.

Additional information on FEMA’s REP Program is available online at FEMA.gov/Radiological-Emergency-Preparedness-Program.

FEMA’s mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards. FEMA Region III’s jurisdiction includes Delaware, the District of Columbia, Maryland, Pennsylvania, Virginia and West Virginia. Stay informed of FEMA’s activities online: videos and podcasts are available at fema.gov/medialibrary and youtube.com/fema. Follow us on Twitter at twitter.com/femaregion3.

Monday, 14 September 2015 00:00

Emergency Management is Not a Part-Time Job

Despite the continued professionalization of emergency management and the expanded roles and responsibilities of emergency managers, many local governments still view emergency management as a part-time job with a part-time or even volunteer emergency manager.

In the grand scheme of things, contemporary emergency management is still a relatively new discipline, having only recently evolved from the civil defense era within the last 40 years or so. Many of those working within emergency management began their careers in civil defense agencies, and some of those agencies carry the civil defense moniker even today.

The tragic events of 9/11 and recent high-profile natural disasters have served to raise the profile of emergency management, and federal grant dollars and doctrine have further helped to define and shape the discipline. Today, emergency management professional certifications and academic degrees are becoming commonplace, and emergency managers are taking on new and expanded roles as the threat profile continues to evolve.



Cloud technology is currently at the forefront of IT and continues to grow as more companies begin to adopt this technology. However, there are still concerns and misconceptions when it comes to cloud adoption. A recent survey conducted by West IP Communications of more than 300 IT managers identified some glaring concerns with cloud-based file sharing and other forms of cloud services. The majority of concerns were cost or security related, as IT managers were worried that cloud adoption would endanger the company or taper the company’s bottom line.

The survey showed that although the majority of IT managers believed they would be able to make their money back in savings, 46 percent didn't see their company earning the same ROI. The perception of potential ROI seemed to shift depending on the size of the company, with 66 percent of large businesses (those with IT budgets of $5 million or greater) believing they would make all their money back. There was also a difference of opinion among companies about how long it would take them to see their full return on investment.



It seems that the enterprise is both intrigued and yet intimidated by the thought of incorporating high-performance computing (HPC) into the data center.

On the one hand, who doesn’t want a powerful, scalable and highly flexible data infrastructure at their disposal? On the other, the financial, technical and logistical challenges to making it work properly are undoubtedly daunting.

Or are they? Most people view HPC in terms of the home-grown, scale-out infrastructure that populates the data centers of Google, Facebook and other Web giants. But as the technology matures, it is being integrated into increasingly modular footprints that can be incorporated into the standard enterprise footprint relatively easily.

Indeed, as Enterprise Tech’s Alison Diana found out from top executives at Cray, Psychosoftpc and other HPC specialists, enterprise deployments are seen as the next big market opportunity, which is why many of the leading platforms are being retooled with advanced power and cooling systems for deployment into critical data infrastructure. The HPC industry, in fact, is working to overcome the persistent myths that advanced computing requires specialized support, or even entirely new data centers, before it can play an active role in emerging data processes like analytics and high-speed transactions.



When data from the massive Ashley Madison hack first leaked online, one tiny bright spot was that researchers said the company appeared to use a strong algorithm to encrypt users' passwords. But now one group says it has already decoded more than 11 million passwords because programming errors in how that encryption was applied left the information less secure than originally thought.

And the passwords unearthed by the decoding hobbyists, known as CynoSure Prime, so far suggest that many who were seeking thrills on the infidelity-focused site had poor digital hygiene.

The top password uncovered so far: 123456, according to Ars Technica. The other passwords that made the top five aren't much better: 12345, password, DEFAULT, and 123456789.
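
The article does not spell out the programming errors, but the general class of weakness is easy to illustrate: a slow, salted password hash such as bcrypt only protects credentials if the same secret is never also run through a fast, unsalted hash. The sketch below is a hypothetical illustration of that gap, not a description of Ashley Madison's actual code; the account name, work factor and token construction are assumptions.

```python
import hashlib          # fast, unsalted digests (MD5 shown for illustration)
import bcrypt           # slow, salted password hashing (pip install bcrypt)

password = b"123456"    # the most common password recovered, per the article

# Intended protection: bcrypt makes every guess expensive (tens of milliseconds).
stored_hash = bcrypt.hashpw(password, bcrypt.gensalt(rounds=12))
assert bcrypt.checkpw(password, stored_hash)

# Hypothetical mistake: deriving a secondary token from the same secret with a
# fast hash. An attacker can test billions of MD5 guesses per second offline,
# recover the secret, then confirm it once against the bcrypt hash.
weak_token = hashlib.md5(b"someuser" + password.lower()).hexdigest()
print(weak_token)
```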



Manufacturers are coming out with new non-volatile memory (NVM) media like 3D XPoint. Does that mean that DRAM and other NVM media such as NAND flash are now dead?

Do new NVM storage access protocols such as NVM Express (NVMe) mean SCSI/SAS and AHCI/SATA are now dead?

My simple answer is no, they all have bright futures.



This Week in Civic Tech presents a lineup of notable events in the space that connects citizens to government services. Topics cover latest startups, hackathons, open data initiatives and other influencers. Check back each week for updates.

Predicting Smog Levels

China is infamous for its horrible air quality. Proof is seen in citizens outfitted in respirator masks, its opaque skylines and, more concretely, in recent sensor-based research estimating that the country’s pollution kills about 4,000 people each day — responsible for about 17 percent of all deaths in China.

To assist, MIT Technology Review reports that IBM is collaborating with the Beijing Environmental Protection Bureau to create a predictive analytics system that forecasts dangerous pollution levels 72 hours before an incident.

The system, dubbed Green Horizon, is expected to eventually provide tangible recommendations for bringing air quality back to safe levels. Temporarily closing coal factories or regulating automobile usage are examples of the kinds of recommendations the solution, which will eventually be marketed to jurisdictions worldwide, could make.

Currently, IBM reports that its predictions can approximate air quality to within a kilometer and are 30 percent more precise than previous methods. The results all stem from IBM's continuing investment in enterprise smart city technologies, but could foreseeably be adapted into open data solutions for civic developers, who could make such insights more consumer friendly with apps. The initiative is part of China's effort to curb its air pollutants by 10 percent by 2017.



Corporate data breaches and privacy concerns may dominate the headlines, but a new report by Allianz Global Corporate & Specialty makes the case that future cyber threats will come from business interruption (BI), intellectual property theft and cyber extortion.

The impact of BI from a cyber attack, or from operational or technical failure, is a risk that is often underestimated, according to Allianz.

It predicts that BI costs could be equal to—or even exceed—direct losses from a data breach, and says that business interruption exposures are particularly significant in sectors such as telecoms, manufacturing, transport, media and logistics.



(MCT) - Among all the apocalyptic disasters that Californians routinely prepare for -- earthquake, drought, wildfire, carmageddon -- the most welcome is rain, even though giant El Niño events like the one currently massing in the Pacific can bring their own set of calamities: flooding, mudslides, carmageddon with hydroplaning.

After four years of drought, creeks and rivers flowing through the Bay Area are more trickle than torrent. But weather scientists are recording water temperatures in the Pacific nearing the highest they've ever seen, suggesting El Niño will open an atmospheric fire hose in the jet stream this winter. That's caused a rising tide of anxiety that has left even the highest-and-driest Californians on edge.

Across the Bay Area, roofers and tree-trimmers are so busy preparing for the onslaught that many have stopped accepting new jobs. And public works crews are shoring up creek beds, clearing storm drains and stocking up on sandbags in preparation.



Issued by the NOAA Climate Prediction Center and the International Research Institute for Climate and Society
10 September 2015

ENSO Alert System Status: El Niño Advisory


Synopsis: There is an approximately 95% chance that El Niño will continue through Northern Hemisphere winter 2015-16, gradually weakening through spring 2016.

During August, sea surface temperature (SST) anomalies were near or greater than +2.0°C across the eastern half of the tropical Pacific (Fig. 1). SST anomalies increased in the Niño-3.4 and Niño-3 regions, were approximately unchanged in the Niño-4 region, and decreased in the Niño-1+2 region (Fig. 2). Large positive subsurface temperature anomalies persisted in the central and east-central equatorial Pacific during the month (Fig. 3), with the largest departures exceeding 6°C (Fig. 4). The atmosphere remained coupled to the anomalous oceanic warmth, with significant low-level westerly wind anomalies and upper-level easterly wind anomalies persisting from the western to east-central tropical Pacific. Also, the traditional and equatorial Southern Oscillation Index (SOI) were again negative, consistent with enhanced convection over the central and eastern equatorial Pacific and suppressed convection over Indonesia (Fig. 5). Collectively, these atmospheric and oceanic anomalies reflect a strong El Niño.

All models surveyed predict El Niño to continue into the Northern Hemisphere spring 2016, and all multi-model averages predict a peak in late fall/early winter (3-month values of the Niño-3.4 index of +1.5°C or greater; Fig. 6). The forecaster consensus unanimously favors a strong El Niño, with peak 3-month SST departures in the Niño-3.4 region near or exceeding +2.0°C. Overall, there is an approximately 95% chance that El Niño will continue through Northern Hemisphere winter 2015-16, gradually weakening through spring 2016 (see the CPC/IRI consensus forecast for the chance of each outcome for each 3-month period).
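
For readers unfamiliar with the index, a Niño-3.4 value is the area-averaged sea surface temperature anomaly in the Niño-3.4 region, and the 3-month values cited above are running three-month means of those monthly anomalies. A minimal sketch of that calculation, using made-up anomaly numbers rather than observed data:

```python
# Hypothetical monthly Niño-3.4 SST anomalies in °C (illustrative values only).
monthly_anomalies = [1.0, 1.2, 1.5, 1.7, 2.0, 2.2]

def three_month_means(anomalies):
    """Running three-month averages, the form in which the index is reported."""
    return [round(sum(anomalies[i:i + 3]) / 3, 2)
            for i in range(len(anomalies) - 2)]

print(three_month_means(monthly_anomalies))  # [1.23, 1.47, 1.73, 1.97]
# Sustained 3-month values of roughly +0.5 °C or more are the commonly used
# El Niño threshold; the advisory above cites peaks near or exceeding +2.0 °C.
```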

Across the contiguous United States, temperature and precipitation impacts associated with El Niño are expected to remain minimal during the early Northern Hemisphere fall and increase into the late fall and winter (the 3-month seasonal outlook will be updated on Thursday, September 17th). El Niño will likely contribute to a below-normal Atlantic hurricane season, and to above-normal hurricane seasons in both the central and eastern Pacific hurricane basins (see the hurricane season outlook for more).

This discussion is a consolidated effort of the National Oceanic and Atmospheric Administration (NOAA), NOAA's National Weather Service, and their funded institutions. Oceanic and atmospheric conditions are updated weekly on the Climate Prediction Center web site (El Niño/La Niña Current Conditions and Expert Discussions). Forecasts are also updated monthly in the Forecast Forum of CPC's Climate Diagnostics Bulletin. Additional perspectives and analysis are also available in an ENSO blog. The next ENSO Diagnostics Discussion is scheduled for 8 October 2015. To receive an e-mail notification when the monthly ENSO Diagnostic Discussions are released, please send an e-mail message to: ncep.list.enso-update@noaa.gov.

(MCT) - It’s not exactly all downhill from here, but today marks the statistical peak of the Atlantic hurricane season.

But one look through history is a clear indication that there are no guarantees Eastern North Carolina won’t be slapped with a storm. After all, two of the 20th century’s worst hurricanes — Floyd in 1999 and Hazel in 1954 — both made landfall after the peak point of the season.

Still, 2015 has been a mild hurricane season in the Atlantic, backing up long-range forecasts by the National Oceanic and Atmospheric Administration and the National Climate Prediction Center, among others.

Tropical Storm Grace became the season’s seventh named storm last week. Up to 10 have been predicted, which would rank 2015 below the average of 12 storms per year.



Friday, 11 September 2015 00:00

Breaches Up but Compromised Records Down

The number of data breaches in the first half of 2015 has jumped 10 percent over the same time period last year, according to a new report from Gemalto. Yet, the number of records that have been compromised is down by 41 percent.

It seems like these are conflicting numbers, doesn’t it? Dark Reading provided a possible reason for this:

This decline in compromised records can most likely be attributed to the fact that fewer large scale mega breaches have occurred in the retail industry compared to the same period last year.



Scrapped, defective or outdated hard drives can be a particular source of danger for companies: in many high-end systems, stored data is not only distributed across many hard drives but, because of built-in data recovery functions, often exists in several versions. As a result, data containing sensitive business secrets can in many cases be reconstructed from the scrapped hard drives of an EMC, NetApp or CommVault system. It is therefore important – not only because of current and forthcoming data protection regulations (GDPR), but also to protect the company's intellectual property – to make sure that all data has been securely destroyed before the hard drives or SSDs of the high-end server or storage system are scrapped.

After having dealt in the previous article with the basics and necessities of LUN erasure, the processes to be performed, and the data erasure process to be employed on a LUN that is to be erased for new users, we will now deal with another scenario that typically requires the secure erasure of LUNs:

The erasure of LUNs because the underlying hardware (an HDD) is defective or no longer functioning optimally and has to be replaced – for example, when a hard drive has exceeded the established threshold of bad blocks.

What exactly does the data erasure process look like?
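
The article goes on to describe the vendor-specific steps. As a rough orientation only, the core of any software-based erasure is the same: overwrite every addressable block, ideally more than once, flush the writes to the medium, and then verify. The following is a minimal sketch of that overwrite idea under stated assumptions (paths, patterns and pass counts are invented); certified erasure tools additionally handle remapped and bad blocks, SSD wear-levelling, verification and audit reporting.

```python
import os

def device_size(path):
    """Size in bytes of a regular file or block device."""
    fd = os.open(path, os.O_RDONLY)
    try:
        return os.lseek(fd, 0, os.SEEK_END)
    finally:
        os.close(fd)

def overwrite(path, passes=(b"\x00", b"\xff", None), chunk=1 << 20):
    """Overwrite every byte of `path` once per pass; None means random data."""
    total = device_size(path)
    for pattern in passes:
        with open(path, "r+b", buffering=0) as dev:
            written = 0
            while written < total:
                n = min(chunk, total - written)
                dev.write(os.urandom(n) if pattern is None else pattern * n)
                written += n
            os.fsync(dev.fileno())

# Destructive! Example usage on a disposable test image, never on a live disk:
# overwrite("/tmp/test.img")
```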



Although 2015 is far from over, it is already proving to be a blockbuster year for headline-grabbing data breaches at corporations and at U.S. government agencies. These “mega breaches” range from the theft of over 20 million personnel files from the U.S. Office of Personnel Management, which seriously compromised national security, to the hack and subsequent data dump of Ashley Madison, the dating website that facilitates infidelity, which exposed the identities of 32 million of its customers. Similarly, the data breach at the health insurer Anthem led to the breach of 80 million medical records, including social security numbers and birthdays, making millions of its customers vulnerable to identity theft.

As these cyber attacks highlight, the importance of cybersecurity in the modern business environment is paramount. The cost of “mega breaches” for companies can be catastrophic. For example, Target reported that its 2013 breach cost the company $264 million in direct expenses.

Even smaller, more run-of-the-mill breaches can cost businesses millions. According to a study by data security research organization Ponemon Institute, the total average cost of a data breach for companies is $3.8 million. These direct costs include hiring experts to fix the breach, investigating the cause and offering credit monitoring for victims. This number does not include business losses caused by customers being wary of patronizing a business, which the report says can eclipse the direct costs.



Thursday, 10 September 2015 00:00

The Road to Next-Generation 911

Our nation is on a journey. Our destination is having well-informed emergency response services that have enough information to quickly come to our aid during any crisis. You may be thinking that we already have that in our 911 system. And in a way, you’re right. For decades, we’ve relied on 911 for police, fire and emergency medical response. But just as the first automobiles set the stage for the advanced vehicles we drive today, the national 911 system is full of potential that has yet to be realized. We need to implement the next generation of 911 to continue receiving the emergency services that keep us safe.

Where We Have Been

To get a better sense of our ultimate goal, consider how far we have come. The national 911 system was established in the 1960s, after a presidential commission recommended the establishment of a single telephone number that could be used nationwide to report emergencies. The early system made use of the first public-safety answering points (PSAPs) to dispatch the appropriate emergency responders.



“Two [expletive] things to get off my chest: If you don’t like Peoria and want to sit here and [expletive], then leave. And two, who stole my crack pipe?” Peoria, Ill., Mayor Jim Ardis recited from a fake Twitter account created under his name in 2014.

For all intents and purposes, Ardis’ online life had been hacked. Not, of course, in the traditional sense, but more in the sense that he had left Web real estate open and squatters seized the opportunity to have some fun.

In March 2014, a prankster launched the Twitter account @peoriamayor, which has since been suspended, as an inside joke to share with friends. The profile featured the comings and goings of the mayor’s twisted alter ego — a man hell-bent on all the liquor, heavy drugs and prostitutes he could find, according to various news sources.



Sending a notification is extremely important, but getting a response back from the recipient can be just as vital. When a crisis occurs, you often need more than just confirmation that the alert was delivered. You need information. Two-way communication transforms your notification service into a crisis management tool. By using a notification service’s two-way feature, like Send Word Now’s Get Word Back, message recipients can easily acknowledge receipt of an alert, confirm their ability to respond to a particular situation, and if necessary, account for their personal well-being.

What use cases support two-way messaging? The following scenarios illustrate some of the ways in which emergency notification service users implement this feature.



Roy Wright, Deputy Associate Administrator for Insurance and Mitigation for the National Flood Insurance Program (NFIP), reminds policyholders that the deadline for requesting a review of their Hurricane Sandy claim is Sept. 15, 2015.

“If you feel your Sandy claim was underpaid, I encourage you to call us so we can take another look; we stand ready to take your calls,” Wright said. “FEMA has begun providing funds to policyholders who completed the review and were due additional payments on their claim.”

More than 12,500 policyholders have entered the review process so far.

Getting started is as simple as making a telephone call. To be eligible for the review, policyholders must have experienced flood damage between Oct. 27, 2012 and Nov. 6, 2012 as a result of Hurricane Sandy and must have had an active NFIP flood policy at the time of the loss. Policyholders can call the NFIP’s Hurricane Sandy claims center at 866-337-4262 to request a review.  It is important to have your policy number and insurance company name when you call.

In advance of the approaching deadline, FEMA expanded its call center hours to make it easier for policyholders to request a review. The call center operates weekdays from 8 a.m. to 9 p.m. Eastern Daylight Time (EDT), Saturday and Sunday from 10 a.m. EDT to 6 p.m. EDT.

Policyholders can go online to www.fema.gov/hurricane-sandy-nfip-claims to download a form requesting a review. The downloaded form can be filled out and emailed to FEMA-sandyclaimsreview@fema.dhs.gov or faxed to 202-646-7970 to begin the review process. For individuals who are deaf, hard of hearing, or have a speech disability and use 711 or VRS, please call 866-337-4262.  For individuals using a TTY, please call 800-462-7585 to begin the review process.

When policyholders call, it is helpful if they have available as much information as possible, including the name(s) on the policy, the address of the damaged property and the ten-digit NFIP policy number that was in effect at the time of the loss. Policyholders will be asked a series of questions to determine whether they qualify for the review. If qualified, they will be called by an adjuster to begin the review. The timing of this call may be affected by the volume of requests. Most reviews can be concluded within 90 days.

Policyholders who have already requested a review of their claim do not have to call again. They are in the system and an adjuster will continue to work with them after the Sept. 15 deadline.

The Sandy Claims Review is intended to be simple for the policyholder and does not require paid legal assistance. Several nonprofit service providers are ready to offer free advice and answer questions policyholders may have. A list of these advocacy groups can be found on the claims review website at www.fema.gov/sandyclaims.

FEMA's mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain and improve our capability to prepare for, protect against, respond to, recover from and mitigate all hazards.

Follow FEMA online at www.fema.gov/blog, www.twitter.com/fema, www.facebook.com/fema and www.youtube.com/fema.  Also, follow Administrator Craig Fugate's activities at www.twitter.com/craigatfema.

The social media links provided are for reference only. FEMA does not endorse any non-government websites, companies or applications.

Thursday, 10 September 2015 00:00

Extending Enterprise Cloud Security

Cloud computing has become an integral part of the average consumer’s life, and the technology is quickly becoming a mainstay in the workplace. Because of the growing expectation of convenience, the workforce is turning to public cloud solutions for greater access to company data. The public cloud puts your valuable data at risk from malicious threats, but its use is all but inevitable. While cloud-based file sharing presents limitless scalability, unparalleled flexibility, and greater agility, you need to be prepared for the security concerns you’re bound to encounter.

What’s so bad about the public cloud?

The common thought throughout the business world is that the public cloud presents major security issues. Speaking about the complexities of the cloud, former U.S. Secretary of Defense Donald Rumsfeld once said:

There are known knowns; there are things we know we know. We also know there are known unknowns; that is to say, we know there are some things we do not know. But there are also unknown unknowns -- the ones we don't know we don't know.



PHILADELPHIA – FEMA Region III has created four new infographics to promote individual and community preparedness. Infographics can be used in many different ways to help reach multiple audiences and guide action. We encourage everyone to use, share and promote these vibrant visual tools. Each infographic focuses on a specific topic:

Protect against a Flood Infographic. If you live in a flood prone area, taking protective measures is a must in order to protect your house and valuables. This image focuses on ways you can make your home more resilient to floods. This infographic can enhance a community newsletter or email. Download at  http://www.fema.gov/media-library/assets/documents/108453.

Pet Preparedness Infographic. A significant number of families have pets who also need to be ready for a disaster. We encourage everyone to take simple steps to prepare their pets, including building a pet preparedness kit and having a pet-friendly plan for disasters. This infographic is perfect for pet stores and veterinarian offices as it promotes pet preparedness and safety. Download at  http://www.fema.gov/media-library/assets/documents/108455.

Make Your Business Resilient Infographic. Roughly 40 to 60 percent of small businesses never reopen their doors following a disaster, but you can. Encourage workplace resiliency through planning and preparation for the unexpected. This infographic is great to share in an email to your workforce and to post on bulletin boards and breakrooms. Download at http://www.fema.gov/media-library/assets/documents/108451.

Is your Disaster Kit Stocked? Infographic. Hurricane Sandy knocked out power to 8.5 million customers for seven days. Make sure your disaster kit is up-to-date and you have a plan in place for your family. Consider displaying this infographic at local stores and supermarkets. Download at http://www.fema.gov/media-library/assets/documents/108699.

By promoting preparedness, we can reduce the impact of future disasters.

FEMA’s mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards. FEMA Region III’s jurisdiction includes Delaware, the District of Columbia, Maryland, Pennsylvania, Virginia and West Virginia.  Stay informed of FEMA’s activities online: videos and podcasts are available at fema.gov/medialibrary and youtube.com/fema. Follow us on Twitter at twitter.com/femaregion3.

The average temperature for the contiguous U.S. during summer 2015 (June–August) was 72.7°F, 1.3°F above the 20th century average. Record summer heat impacted the Northwest, while the central US remained cool. The August average temperature for the lower 48 was 73.0°F, 0.9°F above average, and the 31st warmest on record.

The summer precipitation total for the contiguous U.S. was 9.14 inches, 0.82 inch above average. Driven largely by rainfall early in the season, it was the 16th wettest summer on record. The August precipitation total for the lower 48 was 2.36 inches, 0.26 inch below average, and the 28th driest on record. Drought and wildfires plagued the West and severe drought emerged in the South.

This analysis of U.S. temperature and precipitation is based on data back to January 1895, resulting in 121 years of data.

U.S. climate highlights: summer (June-August)


Maps: June-August 2015 temperature departure from average (top) and percent of normal precipitation (bottom)
  • Eleven states across the West and Southeast were much warmer than average. Oregon and Washington each had their warmest summer on record. Oregon's summer temperature was 4.6°F above average, besting the previous record set in 2003 by 0.6°F. Washington's summer temperature was 5.3°F above average, beating the previous record set in 1958 by 1.1°F.
  • Near- to below-average temperatures stretched from the Central Plains, through the Midwest, and into the Northeast. No state was record cold. Above-average precipitation across these areas suppressed daytime temperatures, contributing to the cool summer.


  • Nine states across the Midwest and Northeast had summer precipitation totals that were much above average. Record precipitation fell across the Ohio Valley during June and July, but a relatively dry August kept the seasonal rainfall totals off the record mark. Above-average precipitation also fell in parts of the West, but this is the dry season for the region and the rainfall did little to improve long-term drought conditions.
  • Below-average summer precipitation was observed in the Southeast and Northwest. Washington had its ninth driest summer on record, receiving only 52 percent of the seasonal average rainfall. The warm and dry summer in Washington created ideal wildfire conditions. By early September the Okanogan Complex Fire had charred over 300,000 acres and destroyed 176 homes. This is the largest wildfire on record in Washington, surpassing the Carlton Complex Fire that charred 250,000 acres in 2014. According to data from the National Interagency Fire Center, during summer 2015 wildfires burned nearly eight million acres in the U.S., the most since reliable record-keeping began in 2000.


  • The U.S. Climate Extremes Index (USCEI) for summer was slightly above average. On the national scale, extremes in warm minimum temperatures and days with precipitation were much above average. The USCEI is an index that tracks extremes (falling in the upper or lower 10 percent of the record) in temperature, precipitation, land-falling tropical cyclones, and drought across the contiguous United States.



Maps: August 2015 temperature departure from average (top) and percent of normal precipitation (bottom)


  • Below-average precipitation was observed across parts of the West, South, and Northeast where Connecticut and Louisiana were much drier than average. Parts of the Northern Plains and Southeast were wetter than average — North Dakota was much wetter than average. In the Southeast, the remnants of Tropical Storm Erika brought beneficial rainfall at the end of the month.
  • According to the September 1st U.S. Drought Monitor report, 30.4 percent of the contiguous U.S. was in drought, up 3.3 percent since late July. Drought conditions improved across parts of the Central Plains and Northeast, where it rained. Drought worsened across the Northwest, Northeast, and Southeast. Drought conditions degraded rapidly in parts of the South where hot temperatures and lack of precipitation quickly stressed manmade and natural systems. Drought conditions remain dire in the West, where wildfires charred over one million acres in August.
  • Outside of the contiguous U.S., drought changed little in Alaska, Hawaii, and Puerto Rico. The remnants of several tropical systems impacted both Hawaii and Puerto Rico, but the beneficial rainfall did little to improve longer-term drought conditions.

U.S. climate highlights: Year-to-date (January-August)


Maps: January-August 2015 temperature departure from average (top) and percent of normal precipitation (bottom)



  • The USCEI for the year-to-date was 35 percent above average and the 17th highest value on record. On the national scale, extremes in warm maximum and minimum temperatures, one-day precipitation totals, and days with precipitation were much above average. The USCEI is an index that tracks extremes (falling in the upper or lower 10 percent of the record) in temperature, precipitation, land-falling tropical cyclones, and drought across the contiguous U.S.

This month's report also contains, as a supplement, an analysis of recurrent tidal flooding (sometimes referred to as nuisance flooding) trends at 27 tide gauges along the contiguous U.S. coast and in Hawaii. The analysis indicates that the number of days with tidal flooding during 2014 continues to increase as sea levels rise, up three-fold to nine-fold at a majority of locations since the 1960s. The analysis also indicates that El Niño events increase the likelihood of tidal flooding, even above the long-term trend.

For extended analysis of regional temperature and precipitation patterns, as well as extreme events, please see our full report that will be released on September 11th.

The nation recently paused to remember the events of Hurricane Katrina, which devastated the Gulf Coast ten years ago. We at Mail-Gard also remember the impact this storm had on not one, but three of our customers. It was our first multiple declaration event, and we were honored and are proud to have assisted our customers with their disaster recovery plan, which allowed them to continue sending their critical communications.

It was a hectic time. A few of our customers temporarily moved a few employees and their families into the Warminster area to work at the Mail-Gard facility until their workplace and homes back in Louisiana were habitable again. When Hurricane Wilma hit Florida, we were called upon to assist with a fourth declaration. The 2005 hurricane season tested our capabilities and bench strength – and gave us tremendous insight into what companies really need in order to be prepared in the event of a full-scale declaration.

- See more at: http://www.iwco.com/blog/2015/09/09/backup-for-disaster-recovery-plan/

Thursday, 10 September 2015 00:00

NOAA: El Niño may accelerate nuisance flooding

According to a new NOAA report issued today, many mid-Atlantic and West Coast communities could see the highest number of nuisance flooding days on record through April due to higher sea levels and more frequent storm surge, compounded by the strengthening El Niño, which is likely to continue into the spring.

These communities may experience a 33 to 125 percent increase in the number of nuisance flooding days, the report said.

These findings build upon two nuisance flooding reports issued last year led by NOAA scientists William Sweet and John Marra. The previously published reports show coastal communities in the United States have experienced a rapid growth in the frequency of nuisance tidal flooding, a 300 to 925 percent increase since the 1960s, and will likely cross inundation tipping points in the coming decades as tides become higher with sea level rise.

“We know that nuisance flooding is happening more often because of rising sea levels, but it is important to recognize that weather and ocean patterns brought on by El Niño can compound this trend,” said Sweet. “By using the historic data that NOAA has collected from tide gauges for more than 50 years, we can better understand and anticipate how the weather patterns may affect nuisance flooding in these communities.”

This table shows communities in the U.S. that may see an increase in predicted nuisance flooding due to El Niño. (Credit: NOAA)

The new report, 2014 State of Nuisance Tidal Flooding, highlights nuisance flood frequencies during the 2014 meteorological year, May 2014 through April 2015, at 27 NOAA tide stations around the United States which have collected data for more than 50 years. The report, for the first time, gives an experimental outlook for the 2015 meteorological year that considers historical flooding trends and differences typical during El Niño.

For instance, these nuisance flooding projections are based on trends that factor in El Niño (the arithmetic behind the quoted percentages is reproduced after the list):

  • Norfolk, Virginia, experienced eight nuisance flood days during the 2014 meteorological year. It may experience 18 days in meteorological year 2015 with El Niño, a 125 percent increase over the projected eight days.

  • Atlantic City, New Jersey, had 21 nuisance flood days in meteorological year 2014. It may experience 36 days in meteorological year 2015 with El Niño, a 33 percent increase over the projected 27 days.

  • San Francisco, California, had 11 nuisance flood days during meteorological year 2014. It may experience 21 days during meteorological year 2015 with El Niño, a 75 percent increase over the projected 12 days.
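
The quick check below reproduces the percentages quoted in the bullets above: each increase is measured against the trend-based projection for meteorological year 2015 (8, 27 and 12 days, respectively), not against the 2014 observations.

```python
# station: (trend-based projection for 2015, El Niño outlook), in flood days
projections = {
    "Norfolk, VA":       (8, 18),
    "Atlantic City, NJ": (27, 36),
    "San Francisco, CA": (12, 21),
}
for station, (projected, outlook) in projections.items():
    pct = (outlook - projected) / projected * 100
    print(f"{station}: {projected} -> {outlook} days (+{pct:.0f}%)")
# Norfolk, VA: 8 -> 18 days (+125%)
# Atlantic City, NJ: 27 -> 36 days (+33%)
# San Francisco, CA: 12 -> 21 days (+75%)
```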

A bicyclist navigates a flooded path in Charleston's Battery. (Credit: NOAA)

The forecast for more nuisance flooding is problematic for these regions as it comes on top of the high nuisance flooding rates they experienced during 2014 and which continue to move upwards as predicted by 1950-2013 trends.

“Improving the resilience of coastal communities means helping them to understand their risks,” said Holly Bamford, Ph.D., assistant NOAA administrator for the National Ocean Service, performing duties of the assistant secretary of commerce for conservation and management. “NOAA monitors sea level trends and interprets how those trends project into the future. This is especially important for coastal community and regional planners in preparing to protect their communities from both nuisance flooding and the increased risk of storm surge which can come from it.”

Nuisance flooding causes public inconveniences such as frequent road closures, overwhelmed storm water systems, and compromised infrastructure. The extent of nuisance flooding depends on multiple factors, including topography and land cover. The study defines nuisance flooding as a daily rise in water level above minor flooding thresholds set locally by NOAA weather forecasters and emergency managers for coastal areas prone to flooding.

Nuisance flooding — which causes such public inconveniences as frequent road closures, overwhelmed storm drains and compromised infrastructure — has increased on all three U.S. coasts, between 300 and 925 percent since the 1960s. (Credit: NOAA)

NOAA plans to continue tracking and reporting recurrent tidal flooding around the country to help communities assess their current situation and plan for future changes already underway.

NOAA’s Center for Operational Oceanographic Products and Services has measured sea levels in the United States for more than 150 years and is the nation’s authoritative source for sea level trends through its National Water Level Observation Network.

NOAA’s National Centers for Environmental Information (NCEI) is responsible for hosting and providing access to one of the most significant archives on earth, with comprehensive oceanic, atmospheric, and geophysical data. From the depths of the ocean to the surface of the sun and from million-year-old tree rings to near real-time satellite images, NCEI is the nation’s leading authority for environmental information.

NOAA’s mission is to understand and predict changes in the Earth's environment, from the depths of the ocean to the surface of the sun, and to conserve and manage our coastal and marine resources. Join us on Facebook, Twitter, Instagram and our other social media channels.

The title hasn’t yet been put to client vote, but Chris Sherman may be the renaissance man of Forrester’s S&R team. As an analyst, Chris advises clients on data security across all endpoints, giving him a broad perspective on current security trends. His experience as a neuroscience researcher at Massachusetts General Hospital also gives him insight into the particular challenges that Forrester’s clients in the healthcare industry face. Lastly, when he hasn’t been writing about endpoint security strategy or studying neural synapse firings, Chris flies Cessna 172s around New England. Listen to this week’s podcast to learn about recent themes in Chris’s client inquiries as well as the troubles facing a particular endpoint security technology.



DENTON, Texas – The Federal Emergency Management Agency (FEMA) has recognized two Louisiana groups for accomplishments in working to prepare their communities for emergencies. New Orleans Medical Reserve Corps won the award for Outstanding Citizen Corps Partner Program in the 2015 FEMA Individual and Community Preparedness Awards. The Coastal Protection and Restoration Authority of Louisiana received an honorable mention in the category of Technological Innovations.

The New Orleans Medical Reserve Corps (NOMRC) was honored for its emergency preparedness outreach and education to vulnerable populations in the city by working with the whole community. NOMRC coordinated with emergency preparedness agencies in the city to plan for the 2014 hurricane season. Working with agencies and health care providers that serve at-risk groups, NOMRC developed easy-to-understand hurricane preparedness messages and materials targeted to the elderly, mobility-impaired, non-English speakers and other vulnerable populations. Innovative partnerships with many local organizations and meeting people where they are in the community are hallmarks of NOMRC’s success. Some of their accomplishments include:

  • Providing presentations on mandatory evacuations, sheltering in place and preparing disaster supplies at community events, health fairs, in senior citizen living facilities, and through door-to-door campaigns.

  • Hosting 10 training sessions on emergency preparedness and resiliency for agencies serving at-risk groups.

  • Using Medicare data to locate individuals on oxygen or dialysis for targeted outreach.

  • Using New Orleans Regional Transit Authority data to survey people with limited mobility.

  • Discussing hurricane preparedness with elementary school children at a hurricane hunter aircraft site.

The Coastal Protection and Restoration Authority of Louisiana worked with a number of community partners to design a Flood Risk and Resilience Viewer. The web-based tool displays flood risk data in an easy-to-understand format for a specific location. It helps individuals and communities understand their current and future flood risks, see probable land loss and climate changes in the future, and learn how flood depths could impact the community and its infrastructure.

Each year, FEMA’s Individual and Community Preparedness Division recognizes the preparedness efforts of organizations around the country. This year 138 organizations applied for recognition. There were 11 award winners and 37 honorable mentions. The 11 FEMA Individual and Community Preparedness Award recipients were recognized on Sept. 8, 2015, in Washington, D.C. During the recognition ceremony, recipients shared their experiences, success stories, and lessons learned with fellow emergency management leaders.

Visit www.ready.gov/citizen-corps/citizen-corps-awards for more information on this year’s award recipients and honorable mentions.


FEMA’s mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards. 

Follow us on Twitter at http://twitter.com/femaregion6 and the FEMA Blog at http://blog.fema.gov.

SAIPAN – CNMI residents can assist with the state’s recovery from the damage caused by Typhoon Soudelor August 1-3, while building their professional skills and drawing a paycheck. Temporary, full-time positions are available locally with the Federal Emergency Management Agency.
FEMA has joined forces with the CNMI Department of Labor to recruit and screen individuals to work in positions including administration, accounting, engineering, courier, writing and public information, and television/radio broadcast production, planning, individual disaster assistance and logistics.

FEMA routinely offers employment to residents in disaster areas to support the local economy and provide jobs to those who may have lost employment due to the event.

Applicants must be U.S. citizens, 18 years of age or older. Individuals will be required to pass a background investigation that includes fingerprinting and a credit check.

Job descriptions are available at http://www.wia.gov.mp/ and at http://www.marianaslabor.net/. CNMI residents may apply by e-mail: fema-dr-4235-mp-hire-me@fema.dhs.gov.

FEMA is committed to employing a highly qualified workforce that reflects the diversity of our nation.  All applicants will receive consideration without regard to race, color, national origin, sex, age, political affiliation, non-disqualifying physical handicap, sexual orientation, and any other non-merit factor.  The federal government is an Equal Opportunity Employer.

For the latest information on CNMI’s recovery from Typhoon Soudelor, visit FEMA.gov/Disaster/4235.

FEMA's mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards.

Thursday, 10 September 2015 00:00

One Month After the Typhoon

SAIPAN – The Federal Emergency Management Agency’s (FEMA) Incident Management Assistance Team (IMAT) was deployed to Saipan days prior to Typhoon Soudelor making landfall.

After Typhoon Soudelor impacted Saipan, President Obama issued a major disaster declaration for the Commonwealth of the Northern Mariana Islands on August 5, 2015, making federal assistance available to aid individuals and communities in recovering from the damage that occurred August 1-3, 2015. The President also appointed a Federal Coordinating Officer to lead the federal team.

FEMA immediately started to mobilize its federal partners in support of recovery efforts.

Mobile Emergency Response Support (MERS) personnel and equipment supported the Commonwealth with secure and non-secure voice, video and information services to support emergency response communications needs.

Representatives from the Department of Health and Human Services (HHS) Office of the Assistant Secretary for Preparedness and Response assessed public health needs and were prepared to provide support if necessary.

The US Coast Guard was on the ground in the Commonwealth of Northern Mariana Islands (CNMI) to perform port condition assessments.  The Ports in CNMI are open to commercial vessel traffic and cargo operations.

The Environmental Protection Agency (EPA) was tasked with the assessment of water systems (potable, non-potable and wastewater) for protection of public health, preparation of a debris monitoring plan, and addressing non-industrial hazardous waste issues.   At the request of FEMA and CNMI, EPA has collected over 350 damaged transformers for eventual recycling.  EPA has set up a household hazardous waste drop off station and will be hosting weekend residential drop offs for household hazardous waste and white goods in upcoming weeks. 

The United States Army Corps of Engineers (USACE) was mission assigned to deploy the following ESF#3 cells: the Temporary Power Planning and Response Team (PRT) and Advanced Contracting Initiative (ACI), the 249th Prime Power Alpha Company, the 249th Delta Company and USACE debris subject matter experts (SMEs) to conduct rapid assessments of critical infrastructure for temporary power requirements and assist FEMA with debris management technical support.

In just one month, 7,934 homeowners and renters have registered for assistance and over $12 million in individual aid has been approved by the Federal Emergency Management Agency (FEMA) since Typhoon Soudelor impacted the small island of Saipan.

The Disaster Recovery Center located at the Pedro P. Tenorio Multi-purpose Center in Susupe has received over 3,500 visitors seeking assistance with their FEMA and Small Business Administration (SBA) applications.

FEMA continues to provide resources through air and sea transportation, including commodities and power restoration needs such as generators and power poles. To date, FEMA has distributed 99,359 liters of water, 71,136 meals, 396 cots and 1,734 tarps to the CNMI government and survivors.

Commonwealth Utility Corporation (CUC) power restoration crews, the USACE 249th Prime Power Battalion Delta Company Team, CNMI and the Guam Power Authority are working diligently to restore power throughout the island.

The Saipan International Airport and the seaport have both resumed 24/7 operations.

65 wells are currently operable, with 5.3 million gallons of potable water being pumped into the system daily, bringing the distribution system to 51% capacity. Intermittent water distribution is now available to 80% of the population.

Over 12,273 cubic yards of debris has been cleared from public rights-of-way.

Nine of the 14 private sector drinking water/bottling companies are producing over 75,000 gallons of drinking water per day, and that number continues to increase as additional private sector companies come back online.

USACE has installed 68 generators and is continuing additional assessments, installations, and de-installations as power is restored. USACE has power experts from the 249th Prime Power Battalion and Delta Team supporting critical infrastructure power requirements.

FEMA and CNMI are working closely together forming a unified effort in support of the citizens of Saipan. 

SBA offers low-interest disaster loans to businesses of all sizes, most private nonprofit organizations, homeowners and renters. To date the SBA has approved 24 loans totaling over $1.6 million.

“Without the full cooperation of all our federal, commonwealth and private sector partners and the volunteer agencies, we could never have accomplished as much as we have in this short period of time,” stated Federal Coordinating Officer Stephen M. DeBlasio Sr. “We will continue to work together as a team helping the residents of Saipan recover from the effects of Typhoon Soudelor.”

FEMA's mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards.


After more than 20 years of analysis and observation of BCM programs, patterns have emerged. Some programs have been very successful, while some never quite achieved much – and others withered away. From those observations, here are 5 common, very simple reasons why too many BCM programs fail to prosper. I’m certain there are others. But if your BCM program is struggling with one of these, make an effort to fix it – or update your resume:



WASHINGTON — In an investigation involving guns and drugs, the Justice Department obtained a court order this summer demanding that Apple turn over, in real time, text messages between suspects using iPhones.

Apple’s response: Its iMessage system was encrypted and the company could not comply.

Government officials had warned for months that this type of standoff was inevitable as technology companies like Apple and Google embraced tougher encryption. The case, coming after several others in which similar requests were rebuffed, prompted some senior Justice Department and F.B.I. officials to advocate taking Apple to court, several current and former law enforcement officials said.



Wednesday, 09 September 2015 00:00

MSP Pricing Should Be About Risk, Not Flat Fees

If value is what you deliver as an MSP, why are you still charging your customers a flat fee based on number of devices or users? Wouldn’t it be better to make calculations based on value, and price your services accordingly?

Of course it would, but value-based pricing isn’t easy. It requires a formula that accounts for risk levels, support commitments, and all the costs associated with delivering a service reliably and effectively.

Such formulas can get fairly complicated, which largely explains why most MSPs have relied on a per-device, per-user pricing model. The model has worked well enough but is far from perfect.
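
What might such a formula look like? The sketch below is a minimal, purely illustrative example of a price built up from a loaded delivery cost, a risk tier and a support commitment; the tiers, multipliers and figures are hypothetical assumptions, not taken from the article or any real MSP price list.

```python
# Illustrative sketch of a value/risk-based MSP pricing formula. The tiers,
# multipliers and figures are hypothetical assumptions, not taken from the
# article or any real MSP price list.

RISK_MULTIPLIER = {"low": 1.0, "medium": 1.25, "high": 1.6}    # client risk profile
SUPPORT_MULTIPLIER = {"business_hours": 1.0, "24x7": 1.4}      # support commitment

def monthly_price(delivery_cost, risk_level, support_tier, target_margin=0.35):
    """Price a service from its loaded delivery cost, risk level and support commitment."""
    loaded_cost = delivery_cost * RISK_MULTIPLIER[risk_level] * SUPPORT_MULTIPLIER[support_tier]
    return round(loaded_cost * (1 + target_margin), 2)

# Example: a service that costs $2,000 a month to deliver, sold to a
# high-risk client with a 24x7 support commitment.
print(monthly_price(2000, "high", "24x7"))  # 6048.0
```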



Even while more and more businesses turn to MSPs to manage their growing data collections, many still hesitate to adopt cloud-based file sharing due to misconceptions about its potential security risks. Every few months brings reports of a new leak of personal information from even the largest corporations and organizations—banks, retailers, government offices—that manage data about income, credit history, bank accounts, and other sensitive information.

With security threats on the rise, businesses are understandably wary about where and with whom they entrust their customers’ information, a mindset that could be costing MSPs potential clients. Here are some of the most common misconceptions about the cloud that could be preventing potential clients from utilizing the cloud-based file sharing services of MSPs.



Wednesday, 09 September 2015 00:00

Defining Reputational Risk

The following article is part of a new blog series that will explore ideas, concepts, discussions, arguments and applications associated with the field of enterprise and strategic risk management.

One of the more striking conclusions contained in Aon’s 2015 Global Risk Management Survey is that damage to reputation and/or brand was considered by the survey cohort to be the most significant risk to the enterprise. The survey was conducted in Q4 of 2014 and received input from over 1,400 respondents from both private and public sector organizations worldwide.

The “Top Ten” most identified risks included:



Wednesday, 09 September 2015 00:00

The Myth of Moving to the Cloud

This is the 2nd in a series of articles examining the “myths” of today’s Business Continuity Management industry.

The emergence of “Cloud” technologies in the past decade has created both benefits and risks.  Whether simply backing critical data up in the cloud, moving applications there, or implementing a full-fledged DRaaS (Disaster Recovery as a Service) program, it is important to remember Murphy’s “law” that says “Anything that can go wrong, will” supplemented with MacGillicuddy’s corollary: “At the most inopportune time”.

The ‘myth’ in the case of the Cloud is simply ‘set it and forget it’. C-suites all over the world have been lulled into the belief that they no longer have IT risks because their data is in The Cloud. It’s not that simple – or that easy. Potential risks in the Cloud are no different from those in a corporate data center: cyber security, data corruption and potential data loss or breach.



Wednesday, 09 September 2015 00:00

Still Some Life in (Some) Data Center Hardware

It appears that enterprise hardware is doing quite well, thank you very much, despite the much ballyhooed rise of software-defined cloud computing infrastructure.

According to IDC’s most recent Worldwide Quarterly Tracker, sales of servers, switches and routers are all in the black, with servers in particular showing the best results in more than a year. While the report shows stronger performance for ODM servers versus stalwarts like HP and Dell, results were positive nearly across the board. The only anomalies were IBM, down nearly 33 percent, and Lenovo, up more than 550 percent, which is undoubtedly the result of the transfer of the IBM server line to Lenovo.

Ethernet switches saw more muted gains of just over 1 percent, but it is an indication that the enterprise is not scaling back purchasing just yet amid all the talk of software-defined networking. Routers, meanwhile, jumped 11.5 percent, which represents in part an 8.3 percent gain in enterprise sales and 7.7 percent for the service market.



Tsunami racers, take your marks! This Sunday, September 13, Race the Wave participants will practice the tsunami evacuation route from the coast to higher ground in Cannon Beach, Oregon. We know that increasing preparedness levels across the board means greater community resilience, and Race the Wave is a great event to highlight during National Preparedness Month this September.

The race finishes at the higher ground of one of the community’s evacuation meeting points, where Cannon Beach will host a preparedness fair with interactive booths to learn more about how to prepare for emergencies and disasters.

[Photo: Residents of Cannon Beach, Oregon gather for a 5K race following a tsunami evacuation route from the beach to a safe meeting spot. The race helped residents build the "muscle memory" of getting to safety if they should feel an earthquake while they are on or near the beach.]

Race the Wave draws on the National Preparedness Month themes of being disaster aware and taking action to prepare, and makes those themes relevant to the local community.

  • Know the Plan: Make a plan with your family about where you will meet. Know if you need to pick your kids up from school. Know where you need to go and what to bring with you.
  • Take the Route: Become familiar with signage in your area. Learn the evacuation route from where you live, work, and play. Evacuate on foot and avoid traveling by car if possible.
  • Race the Wave: Natural warnings are the best sign of a tsunami. If you feel the ground shaking, move quickly inland or to a higher elevation. Listen to the radio to learn of tsunami warnings originating from non-local causes.

This is the second annual Race the Wave event, which includes 10k, 5k and 2k courses open to all abilities; participants can run, walk or roll the route. Visitors and locals alike will learn about the risks posed by the Cascadia Subduction Zone, and what they can do to stay safe.

The Community of Cannon Beach, Clatsop County Office of Emergency Management, Oregon Office of Emergency Management, Oregon Department of Geology and Mineral Industries (DOGAMI), Oregon Office on Disability & Health at Oregon Health & Science University and the Federal Emergency Management Agency (FEMA) Region X office are coming together to support Race the Wave.

FEMA encourages everyone to take steps to become better prepared for an emergency. Whether it’s at home, at work, at school, or in the community, there’s a lot that you can do to be ready and help others be ready too.  This September, take time to get disaster prepared and take action to prepare.

What you can do:

The majority of UK and US consumers between the ages of 25 and 35 are pessimistic about the level of protection afforded to their online data.

This is according to a new study from Intercede, seen by eWeek, in which 1,000 millennials on either side of the Atlantic were polled on whether they trusted companies to hold their personal information securely.

Just five per cent of respondents said they consider their data to be adequately protected from accidental or malicious exposure, while more than two-thirds (70 per cent) believed threats to their online privacy are on the rise as the world becomes more digitally connected.

“We need to think more about how … you prevent the misuse of data and give transparency to the consumer about the degree to which their information is being shared,” Intercede chief executive Richard Parris told eWeek.

Some of the measures companies might use to protect against data breaches include stronger access controls, including two-factor authentication, and secure data deletion methods.

When it comes to the secure deletion of end-of-life data, you can rely on the accredited software and hardware from Kroll Ontrack.

From: http://www.krollontrack.co.uk/company/press-room/data-recovery-news/millennials-pessimistic-about-data-protection,-says-study848.aspx

Wednesday, 09 September 2015 00:00

The Unwanted Internal Auditor

In a growing small business, introducing an internal auditor into a team can be a challenge. In many cases, established units have a difficult time accepting change and welcoming strangers. When that stranger is going to look at practices that may have been implemented years ago by the same staff doing the same job they’ve done for years, maybe even decades, that challenge is intensified. Also, staff may not understand what an internal auditor is, especially if the audit concept was recently introduced. Staff who do have experience with being audited usually describe the practice as time-consuming torture in which they live in constant fear of the informant running back to the boss with every little finding.

As an Internal Auditor, what steps can you take to ease the tension and promote cooperation?

Introduce Yourself

It is amazing what walls will be taken down with a simple “good morning; this is who I am, these are my goals, and I am not a threat.” Many team members just need reassurance that you aren’t coming in “guns ablaze,” or in other words, that you weren’t hired by the company to find and terminate all employees who have ever made a mistake. When connecting with teammates, be personable and considerate. Remind the staff that you are part of the team and are essentially working for them: looking for areas where the development of policies and procedures would make their everyday functions clearer and more efficient, while at the same time confirming that all staff are producing quality service in alignment with already documented policies and procedures.



PHILADELPHIA, Pa. – National Preparedness Month is here and FEMA Region III encourages everyone to take action to prepare. Beginning this month, FEMA Region III will use social media to promote a new preparedness campaign around the hashtag “Throwback Thursday,” but with a preparedness focus (#tbtPrep). Region III will use this hashtag to focus on past regional and national disasters to help inform and guide preparedness actions so individuals and communities are better prepared. FEMA Region III will also push “Take Action Tuesday” (#TakeActionTue) messages, which will emphasize meaningful actions to build preparedness and reduce our risk from disasters. Thursdays we remember. Tuesdays we take action.

This social media campaign will ensure a constant drum beat of preparedness, providing everyone with the necessary tools.  FEMA Region III encourages the public, private businesses, organizations and individuals to use both hashtags to promote preparedness and guide meaningful actions to reduce individual and community risk.

Each of us can make a difference and promote preparedness. “By remembering past disasters and taking active steps to prepare today, we can reduce the impact that future disasters will have on all of us,” stated FEMA Region III Regional Administrator MaryAnn Tierney.

To take part, follow us on Twitter at twitter.com/femaregion3 and share preparedness information with your followers, family, and communities. For additional information on preparedness and to get involved, please visit FEMA.gov, Ready.gov, and America’s PrepareAthon!.

(TNS) - Parents, officers are on duty to keep your children safe.

“If there isn’t a feeling of safety and security on campus, kids aren’t going to learn. They’re not going to feel comfortable there,” Jon Best, the district’s director of student services, said.

Sergeant Daniel Marmolejo doesn’t hesitate when approaching students at Redlands High School—not only because of his friendly and outgoing demeanor, but because during his 20 years working campus safety and security at the school, he has found that building relationships with students is an effective way to keep them safe.

The Redlands Unified School District has six safety and security officers each at Redlands, Redlands East Valley and Citrus Valley high schools. Orangewood High School has a lead security officer and another officer. The middle schools have a lead security person and some monitors who are employed throughout the day. Campus monitors are stationed at the elementary school campuses throughout the day.



Tuesday, 08 September 2015 00:00

Healthcare Cyber Attacks On the Rise

Even as Health Insurance Portability and Accountability Act (HIPAA) regulations take hold, a potentially rewarding vertical market for cloud adoption can be found in healthcare technology. The demand on managed service providers (MSPs) will continue to increase as the industry’s need to secure data storage and cloud-based file sharing grows.

A recent study from security research firm Ponemon found that cyber thieves are costing the U.S. healthcare system approximately $6 billion annually. Criminal cyber attacks on the healthcare industry have increased a startling 125 percent within the past five years, and nearly 90 percent of survey respondents reported having some sort of data breach in the past two years.



(MCT) - Oil-waste disposal regulations that seem to have tamped down earthquakes in southern Kansas are set to expire in less than two weeks, the chief of the Kansas Geological Survey said Wednesday.

But while quakes have declined in recent months, Rex Buchanan, the interim director of the geological survey, cautioned against complacency.

“In spite of the fact activity has been lower over the last few months, I don’t think there’s anybody in my world who views this problem as one that has gone away or is going away,” Buchanan said. “I think we would be pretty short-sighted if we did look at this that way. We’ve got to look at other places and we’ve got to be better prepared than we were last time.”

The Kansas Corporation Commission passed regulations in March to limit the underground disposal of saltwater that comes up with the oil pumped out of wells mainly in Harper and Sumner County.



(MCT) - Missed warning signs, lack of communication and inadequate training hampered the treatment of the man who became the nation’s first patient diagnosed with the deadly Ebola virus, a panel of independent experts concluded.

But the lessons learned from the Dallas case could mean the next Ebola patient will get a faster diagnosis — and perhaps a quick trip out of a local hospital and into a specialized treatment center with staff experienced at handling the infectious disease, officials said.

The conclusions were reached by a panel of five medical experts who volunteered their time to analyze the treatment of Thomas Eric Duncan, a Liberian man who was admitted to Texas Health Presbyterian Hospital Dallas last September with symptoms later confirmed as Ebola.



PHILADELPHIA – FEMA Region III has developed a planning integration guide titled Plan Integration: Linking Local Planning Efforts, which is aimed at helping communities link mitigation principles and actions with various community plans in order to increase community resilience. The guide leads planners and community officials through synchronizing plans and facilitating interagency coordination to reduce risk before and after a disaster.

Use of the planning integration guide enhances risk reduction through community-wide planning by improving coordination; developing specific recommendations for integration into community-wide plans; compiling existing plan measures to include in your hazard mitigation plan; and meeting the Local Mitigation Plan Review Tool requirement to integrate hazard mitigation.

“The guide, Plan Integration: Linking Local Planning Efforts, is a tool communities can tap into to strengthen resiliency through enhanced hazard mitigation planning. Community resilience is directly tied to recovery, which means this resource has the potential for impacting all phases of the full disaster cycle,” said FEMA Region III Regional Administrator MaryAnn Tierney. “This kind of pilot program lays the foundation for stronger resilience in any community. When community planners who live and work in communities set their own resilience priorities they take ownership of mitigation planning and the enthusiasm that generates inspires others to do the same – and that can help jump-start even more widespread success.”

The planning integration guide uses step-by-step instructions and a checklist, real-world examples from communities, and illustrations to assist in gathering and organizing information. Through use of the guide and its resources and tools, communities can develop their own plan integration document as well as identify where gaps exist and develop strategies to address the gaps. The end result of this effort is a synchronized planning effort to increase community resiliency and reduce the risk posed by disasters.

Plan Integration: Linking Local Planning Efforts  is available at https://www.fema.gov/media-library/assets/documents/108893. For further information about the guide, contact fema-r3-hm-planning@fema.dhs.gov.

FEMA’s mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards. FEMA Region III’s jurisdiction includes Delaware, the District of Columbia, Maryland, Pennsylvania, Virginia and West Virginia.  Stay informed of FEMA’s activities online: videos and podcasts are available at fema.gov/medialibrary and youtube.com/fema. Follow us on Twitter at twitter.com/femaregion3.

Tuesday, 08 September 2015 00:00

HP Advances IT Security Analytics

In a development that could provide a lot of relief to IT organizations pressed by IT security challenges, Hewlett-Packard this week unveiled an appliance through which it will apply cloud-delivered analytics to simplify IT security, along with an update to its Fortify application scanning software that uses machine learning to more accurately identify potential security issues.

At the HP Protect 2015 conference, HP unfurled an HP DNS Malware Analytics service that makes use of an appliance that gets installed next to a DNS server. As network traffic moves through that appliance, an HP cloud service analyzes it to identify clean traffic.



Now that enterprise infrastructure is gravitating toward more modular, white-box configurations, attention has shifted up the stack to find ways to squeeze more performance from virtual and cloud-based data environments.

The need for advanced software-based architectures has long been apparent, but it is only lately that IT executives are starting to take a serious look at how they are to be designed. How flexible should they be? How much automation is required? What sort of life expectancy is reasonable? And who, or what, should be responsible for management, governance and oversight?

The answers floating around these days run the gamut from stolid, predictable architectures that can be provisioned and scaled to meet emerging data loads to free-wheeling, application-centric designs capable of building themselves up and tearing themselves down on a whim. The emerging discipline of Enterprise Architecture is dedicated specifically to guiding the enterprise through these seemingly contradictory approaches.



Tuesday, 08 September 2015 00:00

McAfee report shows ransomware on the rise

The three months to June 2015 saw ransomware continue its unstoppable rise as one of the most common risks to business data worldwide.

This is according to McAfee Labs’ latest Threats Report, published on September 1st, in which the antivirus vendor said it had detected 58 per cent more new samples of the malware type over the quarter than it did between April and June 2014.

Commenting on the figures, Raj Samani of Intel Security also noted that ransomware has become simpler to deploy due to “crimeware services that provide attackers with user-friendly graphical user interfaces or consoles to customise attacks”.

“All attackers have to do is fill in the email addresses they want to target and wait for the money to come rolling in,” he told Computer Weekly.



Tuesday, 08 September 2015 00:00

Appreciating the IT Labor Factor

While this is the week the United States celebrates the contributions the organized labor movement made to the development of the country, it’s also a good time to appreciate the critical role IT labor plays in the success of any MSP.

More often than not, the biggest limiting factor that any MSP faces today is its ability to attract IT talent. The simple fact is that IT has never been more complex to manage. The problem is that finding and retaining people with not only the right IT skills but, just as importantly, the right attitude has never been more difficult.

Nowhere is that a bigger issue than in the realm of security, where the unemployment rate for IT professionals with IT security expertise is essentially zero. For that reason, Larry Cecchini, president and CEO of Secure Designs, a provider of managed security services based in Greensboro, North Carolina, said one of the most important decisions his company ever made was to set up shop near seven different local colleges. While Secure Designs does everything it can to hold on to talent, Cecchini said the MSP relies heavily on talent recruited from local colleges to replenish its ranks. Given that many of those students have ties to the local area, Cecchini said it’s a lot more practical to grow his own talent base than it is to hope the right candidate someday wants to move to Greensboro.



Tuesday, 08 September 2015 00:00

Uncovering the Real Value of the Cloud

Even though the cloud is becoming old news in the enterprise industry, there is still a lot of work to do when it comes to creating the kinds of data environments that meet the performance needs of emerging workloads.

In many cases, the cloud just sort of happened to the enterprise and was simply incorporated into legacy infrastructure with varying degrees of success. The main job going forward, then, is to transform the cloud from a collection of parts into a unified ecosystem, which in all likelihood will prove to be as difficult a job as it was in the local data center.

According to a new report by Logicalis, the divergence of technology and capability across the cloud is substantial. Particularly when it comes to key requirements like data protection, disaster recovery and networking services, clouds can range from basic consumer-level functionality to the ultra-scalable, ultra-secure environments required of health care, financial and other industries. This means the typical enterprise has to worry just as much about over-performance in the cloud as under-performance. The best way to handle this, of course, is to gain a realistic view of the workloads you intend to migrate and the levels of service they require—particularly in areas like uptime, data replication/retention and infrastructure support.
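
One way to put that advice into practice is to score each workload's requirements against what each cloud tier actually delivers. The sketch below is a purely illustrative Python example; the tier names, thresholds and the sample workload are invented assumptions, not figures from the Logicalis report.

```python
# Illustrative sketch: match workloads to cloud service tiers by the
# service levels they require. Tier names, thresholds and the sample
# workload are hypothetical, not figures from the Logicalis report.

CLOUD_TIERS = {
    "consumer":  {"uptime": 99.0,  "retention_days": 7,   "dr": False},
    "business":  {"uptime": 99.9,  "retention_days": 30,  "dr": True},
    "regulated": {"uptime": 99.99, "retention_days": 365, "dr": True},
}

def suitable_tiers(workload):
    """Return the tiers that meet or exceed every requirement of the workload."""
    return [
        name for name, tier in CLOUD_TIERS.items()
        if tier["uptime"] >= workload["uptime"]
        and tier["retention_days"] >= workload["retention_days"]
        and (tier["dr"] or not workload["dr"])
    ]

patient_records = {"uptime": 99.95, "retention_days": 180, "dr": True}
print(suitable_tiers(patient_records))  # ['regulated']
```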



Friday, 04 September 2015 00:00

Value Proposition of Resilience

The degree of interdependence across critical infrastructure sectors has been amplified by globalization, advanced technologies and supply chain pressures. Our team at Johns Hopkins University Applied Physics Laboratory is studying — through modeling, analyses and empirical research in places such as the Port of Baltimore and Austin, Texas — the measurable impact of disruptive events, governance and societal demands upon resilience ecosystems in bounded geographic areas.

Governments, communities and individuals are not helpless in the face of natural disasters like Typhoon Haiyan, the category-5 super typhoon that struck the Philippines in November 2013, killing thousands and displacing hundreds of thousands. There are practical safeguards that can be designed within the multidisciplinary worlds of engineering, cyberphysical, and the social, behavioral and economic sciences if we systematically identify the independent variables that contribute to critical infrastructure interdependencies, conduct analyses that support a generalizable model, and test these methods under simulated and real-world conditions. Drawing from the principles of collective action theory and computational analytics, our studies are seeking to quantify the cost accounting and value proposition behind resilience by integrating economic factors into the research.



Most IT organizations spend a fair amount of time trying to figure out the actual cost differential between delivering IT services via a public cloud versus on premise. There’s no doubt that virtual machines running in a public cloud are going to be less expensive, but when all the costs of delivering an IT service are fully loaded, the public cloud is not always the cheaper choice.

To help IT organizations sort through that financial morass, Cloud Cruiser this week announced it has added CloudSmart-Now templates to its cloud financial management service. The templates make it simpler for IT organizations to figure out their true costs across hybrid cloud computing environments.

Deirdre Mahon, chief marketing officer for Cloud Cruiser, says the templates were built around a service that collects cost data from Amazon Web Services (AWS), Azure, Windows Azure Pack (WAP), VMware and OpenStack. The entire IT financial analytics service can be set up in as little as five days and continually monitors pricing changes being made by the “Big Five” cloud service providers. Using that data, IT organizations can then make a more intelligent choice about where to host any given application workload.
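
The underlying comparison is simple arithmetic once every cost component is loaded in. The sketch below illustrates the idea with hypothetical monthly figures; it is not Cloud Cruiser's template logic, and the numbers do not come from any provider.

```python
# Illustrative sketch: compare the fully loaded monthly cost of running a
# workload in a public cloud versus on premise. All figures are hypothetical
# and are not from Cloud Cruiser, AWS, Azure or any other provider.

def monthly_cost(compute, storage, egress=0.0, labor=0.0, facilities=0.0, licenses=0.0):
    """Sum every cost component so the comparison is apples to apples."""
    return compute + storage + egress + labor + facilities + licenses

public_cloud = monthly_cost(compute=3200, storage=450, egress=600, labor=800, licenses=300)
on_premise   = monthly_cost(compute=1800, storage=300, labor=1500, facilities=700, licenses=300)

cheaper = "public cloud" if public_cloud < on_premise else "on premise"
print(f"public cloud: ${public_cloud:,.0f}/month, on premise: ${on_premise:,.0f}/month -> {cheaper}")
# public cloud: $5,350/month, on premise: $4,600/month -> on premise
```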



Assumptions can be the downfall of even the best Business Continuity Plan (we’ve addressed that issue in an earlier blog). Sometimes it’s not the overt assumptions we make (“All critical IT systems will be available within 4 hours of the disruption.”) but the ones we don’t realize we’ve made that may jeopardize our ability to recover – or to sustain a recovery.

Chief among these, of course, is the unspoken assumption that our Plan will work, everything will be restored in short order and our business will be back to ‘normal’ within hours – or at worst in a day or two. It’s a common assumption. Even if it’s not written into the Plan, that short recovery horizon may be implied, simply because we don’t plan for what we may need if the recovery takes longer (a week or a month or more). It is curious that many organizations don’t plan for prolonged disruptions – especially when the majority of Business Impact Analyses (BIAs) ask about impacts over extended time periods. Why ask a process owner what resources they’ll need 4 weeks after the disruption – then only require them to plan for a 48-hour recovery?



Small businesses not prepared for disaster

Three in four small business owners in the US do not have a disaster recovery plan, but more than half say it would take at least three months to recover from a disaster. For companies with fewer than 50 employees, only one in five (18%) have a disaster recovery plan. That is according to a new survey of US small business owners conducted by Harris Interactive on behalf of Nationwide Direct and Member Solutions.

“Small businesses are least likely to have disaster recovery insurance,” says Mark Pizzi, president and chief operating officer of Nationwide Direct and Member Solutions. “And yet they are the ones most affected by a disaster. That’s why it’s essential for small businesses to have a disaster recovery plan.”

For many without a plan for their business, disaster recovery is simply a low priority (34%). Time (11%) and cost (15%) play less of a role in the decision not to have a written disaster recovery plan in place. Nationwide Direct notes that America's small business owners may be feeling overconfident, as one in four (26%) believe the likelihood of a natural disaster occurring in their area is slim and just over one-third (37%) say climate change and the weather phenomenon El Nino have decreased the likelihood of a natural disaster impacting their business.

Perhaps that overconfidence is also reflected in the Business Continuity Institute's latest Horizon Scan report which showed that business continuity professionals working for SMEs globally were less concerned about the prospect of a natural disaster than larger organizations. For example, with adverse weather only 41% expressed concern about this threat materialising, whereas this figure was 55% for larger organizations.

Given the pervasiveness of SaaS applications like Office 365 and Salesforce, you’d think we’d have a pretty good handle on SaaS data protection by now. But according to Jeff Erramouspe, we’d all probably be surprised by how many IT departments, users and executives have failed to fully understand the nuances of proper SaaS data backup and recovery.

Erramouspe is vice president and general manager of EMC’s Spanning unit, which provides data backup and recovery for cloud applications. In a recent email interview, Erramouspe shared some misconceptions about SaaS data protection, beginning with the notion that SaaS application data doesn’t need to be backed up:

While it is true that SaaS vendors do protect and replicate their customers’ data, they only do it to protect the customer from problems on the SaaS application infrastructure side, such as server failures or drive crashes. They don’t necessarily provide bullet-proof protection from user-driven data loss. You’d be surprised by how many experienced IT professionals don’t know this. While the cloud is a great place to cost-effectively run applications, accidental deletion and other mistakes can cause losses from which Google, Microsoft, and Salesforce.com can’t always help you easily recover. For Google, its policy states that if you permanently delete something, it’s not recoverable—it’s gone forever. Salesforce has a paid service to get data back, but it is expensive ($10,000 per incident), takes time (up to three weeks just to get started) and it only commits to best efforts—most data can’t be restored in full. And the Microsoft Office 365 SLA doesn’t include data recovery, despite the belief of many customers that it does.
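
The practical takeaway is that SaaS data needs an independent, point-in-time copy held somewhere the vendor's own retention and deletion policies cannot reach. The sketch below shows that pattern in outline; `saas_client` and `backup_store` are hypothetical stand-ins for whatever export API and object store an organization actually uses, not the real Google, Microsoft or Salesforce interfaces.

```python
# Minimal sketch of independent SaaS backup: export each object type to storage
# the SaaS vendor does not control, keyed by date, so user-driven deletions can
# be rolled back. `saas_client` and `backup_store` are hypothetical stand-ins
# for a real export API and object store.
import datetime
import json

def backup_saas_objects(saas_client, backup_store, object_type):
    """Take a dated snapshot of one object type and copy it outside the SaaS platform."""
    snapshot_key = f"{object_type}/{datetime.date.today().isoformat()}.json"
    records = saas_client.export_all(object_type)        # hypothetical export call
    backup_store.put(snapshot_key, json.dumps(records))  # hypothetical object-store call
    return snapshot_key

def restore_record(backup_store, snapshot_key, record_id):
    """Pull a single record back out of a snapshot after an accidental deletion."""
    records = json.loads(backup_store.get(snapshot_key))
    return next(r for r in records if r["id"] == record_id)
```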



(MCT) - Fort Lauderdale, Fla. City commissioners fear the regional 911 dispatch system the city joined a year ago isn't doing the job it was supposed to, and is putting visitors and residents at risk.

They're asking City Manager Lee Feldman to come up with a Plan B — including the potentially expensive option of leaving the system — if the county can't quickly fix the emergency dispatch system's continuing problems.

Feldman sent a letter last week to County Administrator Bertha Henry critical of the "underperforming" system and requesting a meeting with the participating cities to discuss performance issues and how they will be resolved.



Friday, 04 September 2015 00:00

Enterprises warned not to ignore shadow IT

Rather than ignore or attempt to block the use of unsanctioned apps and devices in the workplace, organisations should seek to understand what it is that actually drives their users to shadow IT.

This is according to Julian Cook, director of UK business at M-Files, who warned today (September 3rd) that only by working together with employees can enterprises combat the security risks caused by the use of technology outside of IT’s control.

“Understanding the use of unauthorised devices and apps will allow stakeholders … to identify and agree sanctioned solutions,” he said, arguing that this will support “both security and data protection across the business”.



The CloudBridge Connector feature of the Citrix NetScaler appliance connects enterprise datacenters to external clouds and hosting environments.

With it, you can configure a CloudBridge Connector tunnel between two different datacenters to extend your network without reconfiguring it, and leverage the capabilities of the two datacenters. Having a CloudBridge Connector tunnel configured between the two geographically separated datacenters enables you to implement redundancy and safeguard your setup from failure.

The CloudBridge Connector tunnel helps achieve optimal utilization of infrastructure and resources across two datacenters. The applications available across the two datacenters appear as local to the user.

To connect one datacenter to another, you set up a CloudBridge Connector tunnel between a NetScaler appliance that resides in one datacenter and another NetScaler appliance that resides in the other datacenter.
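
The configuration itself is done on the appliances, but the redundancy it buys can be illustrated conceptually: applications treat services in either datacenter as local and fall back to the peer site when the primary is unreachable. The sketch below shows that concept only; it is not NetScaler or CloudBridge Connector configuration, and the hostnames and port are hypothetical.

```python
# Conceptual sketch only -- NOT NetScaler or CloudBridge Connector configuration.
# It illustrates the redundancy the tunnel enables: applications treat services
# in either datacenter as local and fall back to the peer site when the primary
# is unreachable. Hostnames and the port are hypothetical.
import socket

DATACENTER_ENDPOINTS = ["app.dc1.example.internal", "app.dc2.example.internal"]

def reachable(host, port=443, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def pick_endpoint():
    """Prefer the first (local) datacenter; fail over to the peer site."""
    for host in DATACENTER_ENDPOINTS:
        if reachable(host):
            return host
    raise RuntimeError("no datacenter endpoint reachable")
```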



As IT leaders have grown more comfortable with the security of software-as-a-service offerings and cloud storage, they also have started turning to cloud-based managed security services. For everything from commoditized basic services such as vulnerability testing and cloud security gateways to more sophisticated identity management and threat analysis, public-sector chief information security officers are growing more willing to consider managed security service providers (MSSPs).

Cost savings are one obvious consideration, but so is the fact that state and local governments are finding it next to impossible to compete with the private sector for cybersecurity talent. In a 2015 NASCIO state government IT workforce study, 67 percent of respondents said security was the most difficult position to fill and retain.

“Security is becoming highly specialized, and we are having a very difficult time finding appropriate people to do in-house security,” said Ralph Johnson, chief information security and privacy officer of King County, Wash., whereas a managed security services team often has the expertise and concentration he needs. For example, King County uses a managed security service for its network log and security event management. “For me to appropriately run that with an in-house solution, I would have had to hire three staffers and that would have been their sole function,” Johnson explained. “That would cost me $1.5 million over five years. I got a managed security product from a vendor that cost me $850,000 over the same time period.”



Thursday, 03 September 2015 00:00

MSPs: Remain Compliant in Any Industry

One of the best ways for managed service providers (MSPs) to expand their client base is by reaching out to new industries. This produces a large pool of potential new clients and can build an MSP’s reputation, customer trust, and brand recognition.

With any venture into new industries, MSPs need to be certain that they comply with the regulations and legal requirements specific to that sector. This not only ensures that an MSP operates within the boundaries of the law when managing and archiving often-sensitive data over cloud-based file sharing, but helps the MSP to gain additional expertise that will make its services indispensable to new clients.

Businesses are struggling to protect data. As much as 20 percent of the files shared insecurely over the cloud contain personal information that compliance laws say should not be made public. This is a prime opportunity for MSPs to build their client base by reaching out to businesses that handle private information, providing them with a secure data management system.



Two governors, on opposite sides of the country, took executive action to beef up cybersecurity in their respective states on Monday, Aug. 31. California Gov. Jerry Brown and Virginia Gov. Terry McAuliffe both instituted aggressive cybersecurity orders to prepare for and defend against potentially damaging cyberattacks in their states.

While both mandates are geared toward the implementation of better cyberprotection protocols, Brown’s order outlined the need for a multi-stakeholder California Cybersecurity Integration Center (Cal-CSIC) under the state’s Office of Emergency Services (OES).

Brad Alexander, spokesperson for the OES, said the newly announced center will serve as a single location for cyberthreat reporting and will help to ensure best practices are adopted across the state’s public and private sectors.

Thursday, 03 September 2015 00:00

Datto Expands Technical Support

Datto is now offering free technical support to its managed service provider (MSP) partners and end customers for all its products 24 hours a day, seven days a week.

The Norwalk, Connecticut-based backup and disaster recovery (BDR) solutions provider has expanded its customer support options to include Datto Backupify solutions.



Thursday, 03 September 2015 00:00

VMware Plays the Hybrid Cloud Card

When it comes to all things cloud, VMware has a vision that generally aligns with the way most internal IT organizations see the cloud computing world: The cloud is an extension of IT environments that will continue to run on premise for a very long time.

At the VMworld 2015 conference this week, VMware reaffirmed that vision of IT with the launch of a Unified Hybrid Cloud Platform that enables IT organizations to invoke object storage, a database-as-a-service offering based on Microsoft SQL Server and disaster recovery capabilities via the VMware vCloud Air cloud service. At the core of that offering is the EVO software-defined data center (SDDC) software, which VMware is building out as its base management platform.

Mark Chuang, senior director of product marketing and product management for VMware, says the primary VMware goal is to make it easier for IT organizations to consume a broad range of emerging technologies at a higher level of abstraction by using SDDC software. Without that capability, most internal IT organizations would not be able to absorb the costs associated with stitching all those technologies together on their own.



(MCT) - When you call 911, you're not likely to be at your calmest.

The Beaufort County Sheriff's Office's new Smart911 -- a program created to get information to dispatchers quicker -- could mean you get help faster throughout the county and on Hilton Head Island.

The website, unveiled Monday, allows residents and businesses to create a free, safety profile online that lists health information, names and photos of family members, pets in the home, floor plans, vehicle details and emergency contacts.

"We are trying to target as many residents as possible," Maj. David Zeoli, deputy division commander of the sheriff's office emergency management division, said. "The more the better."



This week, G DATA claimed that rogue retailers are installing malware on Android-based phones from China and selling them on the open market.

This is a supply chain issue, since the problem is occurring before the devices are sold, and so far, the issue has mostly impacted Chinese consumers—though some infected phones have been found in Europe. The malware has been found on more than 20 brands of mobile phones. The article from eWeek suggests that it “underscore(s) the current difficulties in securing technology as it moves through the supply chain to its destination.”

What we must realize is that supply chain issues are more widespread and have a far different profile than just a few shadowy characters intercepting crates of phones on a dock somewhere:

In 2013, classified documents leaked by former contractor Edward Snowden showed that the U.S. National Security Agency and other national intelligence agencies have regularly infiltrated supply chains feeding technology to countries of interest, compromising devices so that they act as electronic moles. Devices from Cisco, Dell and other manufacturers, for example, have all been modified in transit to their destination to include implants to enable NSA monitoring.



KANSAS CITY, Mo. – This is the first week of National Preparedness Month (NPM) and in the Midwest it’s off to a roaring start with active outreach and conversations meant to inspire individuals and families to take action and prepare for flooding—the most common and costly disaster in the United States. Yes, it can happen where you live!

During this first week of NPM, the U.S. Department of Homeland Security’s Federal Emergency Management Agency’s 10 regional offices; county and local emergency managers; other federal agencies; businesses; voluntary and other organizations; as well as families and individuals will use news releases, social media, educational activities and events to promote the message that preparing for floods is important for protecting lives, livelihoods and properties.

“Flooding is fresh on the minds of many people in Iowa, Kansas, Missouri and Nebraska. With so much flooding during the past few months, it’s a good time to consider the true risk,” said FEMA Region VII Regional Administrator Beth Freeman. “But it’s not enough to simply realize flooding is a real threat for us all. This month, this week, today, we hope everyone will take action to develop and practice a family emergency communication plan for hazards like flooding. This year our theme is, 'Don’t wait. Communicate. Make your emergency plan today.'”

Fewer than half of Americans have taken the time to plan what they will do if there is an emergency. Sitting down and developing a communication plan with loved ones doesn’t cost a thing, but can save a lot if a flood or another disaster impacts you and your family.

In addition to floods, hurricanes, wildfires, tornadoes and earthquakes also occur frequently and devastate lives across the country every year. To encourage disaster planning for all hazards, FEMA and the Ad Council just launched a new series of public service announcements (PSAs) in English and Spanish, available at www.ready.gov/september. The PSAs direct audiences to www.ready.gov/communicate for tools and resources to help develop and practice a family emergency communication plan.

Managed and sponsored by the Ready campaign, National Preparedness Month is designed to raise awareness and encourage Americans to take steps to prepare for emergencies in their homes, schools, organizations, businesses, and places of worship. National Preparedness Month is an opportunity to share emergency preparedness information and host activities across the country to help Americans understand what it truly means to be ready.

National Preparedness Month Weekly Themes

•Week 1 (September 1–5)  Flood

•Week 2 (September 6–12)  Wildfire

•Week 3 (September 13–19)  Hurricane

•Week 4 (September 20–26)  Power Outage

•Week 5 (September 27–30)  Lead up to National PrepareAthon! Day, September 30

National Preparedness Month culminates with National PrepareAthon! Day on September 30 when cities and counties across the country are planning community-wide events bringing together schools, their business community, government, faith leaders, hospitals, individuals and families, and others to participate in preparedness drills and activities for hazards that are relevant to their area.

For more information visit Ready.gov/September or follow the campaign on Facebook, at https://www.facebook.com/readygov, on Twitter, at https://twitter.com/Readygov,  or for FEMA Region VII, www.twitter.com/femaregion7. For more information about events for America's PrepareAthon throughout September, and for National PrepareAthon! Day information, visit www.ready.gov/prepare.

Quick facts to consider as you plan:

•Text messages and social media can be better ways to communicate during an emergency when phone lines are tied up, or even not working.

•Homeowners and renters insurance policies don’t cover floods.

•Talking to children about emergencies and involving them in the planning process helps children feel they have some control over what could happen during an emergency. It can also make recovery much easier on everyone.

Follow FEMA online at www.twitter.com/fema,  www.facebook.com/fema,  and www.youtube.com/fema.   Find regional updates from FEMA Region VII at www.twitter.com/femaregion7.  Also, follow Administrator Craig Fugate's activities at www.twitter.com/craigatfema.  The social media links provided are for reference only. FEMA does not endorse any non-government websites, companies or applications.

FEMA’s mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards.

Thursday, 03 September 2015 00:00

A Value-Based Approach to Risk Management

CEOs drive their organizations to pursue opportunities with the objective of building and sustaining long-term enterprise value. It is what the Board of Directors expects. In the book Built to Last, one of the principles asserted by the authors is that a company sustains itself by setting “big hairy audacious goals” requiring the commitment of its personnel working outside their comfort zone.[1] That is exactly what a good CEO does. Everyone knows that the status quo is a non-starter in a rapidly changing environment. Anyone standing still is likely to get run over.

Within this context, what is the role of risk? Many argue that risk management should contribute value. While this assertion is easy to make, what does it really mean? And what is the Board’s role from a risk oversight standpoint to ensure a value-based approach?

There are two ways of looking at this topic: the strategic view and the proprietary view.



SAN DIEGO, Calif. – This is Part 3 in a series that explores the innovative and highly effective ways that organizations can strengthen their response to a cyber-attack. This series is written by CAPT. Mike Walls, former Commander of U.S. Navy Cyber Readiness and current Managing Director, Security & Operations at EdgeWave.

A professional Red Team is a group of highly trained specialists in a given field that can effectively analyze a problem from an adversarial perspective. Although their backgrounds are diverse, all Red Team members are at the top of their field, many of whom started out within the US Military's Cyber organizations. This piece will uncover who Red Team members truly are, what they do and why we should trust them.

With their help and expert insights, we can better prepare for tomorrow's battle against cyber criminals.

Click HERE to read the full blog post

Digital identities are being exploited on a routine basis by sophisticated cybercriminals, a new study from ThreatMetrix found.

No surprise there, but "ThreatMetrix Cybercrime Report: Q2 2015," based on attacks the security technology company detected between April and June 2015, uncovered a number of unforeseen trends.

For one, increasingly sophisticated attackers were targeting diverse data sets to effectively stitch together consumers' credentials. And new account creation continued to be at high risk as fraudsters use stolen credentials harvested from massive breaches.



Wednesday, 02 September 2015 00:00

Securing the Internet of Things

Cloud technologies have taken the business world by storm, bringing with them greater agility, flexibility, and cost savings. Unfortunately, it’s becoming more and more apparent that many IT professionals don’t understand how to effectively manage cloud usage in their companies. Cloud-based file sharing is a particular sore spot for IT departments that are trying to keep up with the ever-growing Internet of Things (IoT). 

The IoT is capable of enabling companies to create new revenue models and cut costs by connecting more “things” to the network that can collect essential data. With so much data being stored in the cloud, security is a serious concern for every business owner. After dissecting a recent Ponemon Institute study commissioned by data security specialist SafeNet, Nathan Eddy of eWeek wrote:

70 percent of respondents agree that it is more complex to manage privacy and data protection regulations in a cloud environment, and they also agree that the types of corporate data stored in the cloud—such as emails, and consumer, customer and payment information—are the types of data most at risk.



Wednesday, 02 September 2015 00:00

The Myth of the RTO

This is the 1st in a series of articles examining the “myths” of today’s Business Continuity Management industry. 

In a standard, methodology-driven BCM program, much of the industry follows the RA-BIA-Strategy-Plan Development cookie-cutter path, assuming that all of these will lead to a viable and sustainable Business Continuity Planning program. Industry ‘experts’ cling to this methodology mainly because the cookie-cutter approach is easy to follow. But does the outcome reflect the needs of the organization?

Recovery Time Objective (RTO), as a key driver of a BCM program, must be examined. In the early days, RTO was a useful indicator of recoverability. BCM has evolved from the IT disaster recovery of the ’70s, when the focus was on restoration of mainframes. Once you understood how long it would physically take to recover the mainframe, it was simple to set a Recovery Time Objective based on that capability.
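
In that era the arithmetic really was that straightforward: add up how long the physical recovery steps take and state an RTO from that capability. The sketch below illustrates the idea; the step names, durations and buffer are hypothetical examples, not taken from the article.

```python
# Illustrative sketch: deriving an RTO from measured recovery capability,
# as was done for early mainframe disaster recovery. Step names, durations
# and the buffer are hypothetical examples, not taken from the article.

RECOVERY_STEPS_HOURS = {
    "declare disaster and notify the recovery site": 1.0,
    "restore the system image from backup tapes":    6.0,
    "restore application data":                      4.0,
    "validate and release to users":                 1.5,
}

capability_hours = sum(RECOVERY_STEPS_HOURS.values())  # what recovery physically takes
rto_hours = capability_hours * 1.2                     # add a contingency buffer

print(f"measured capability: {capability_hours} h, stated RTO: {rto_hours:.1f} h")
# measured capability: 12.5 h, stated RTO: 15.0 h
```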



Tropical Storm Erika came along right on cue to bring thoughts of disaster readiness on the eve of National Preparedness Month.

Macon-Bibb County Emergency Management Agency Director Spencer Hawkins was tracking the storm last week.

"We're watching and waiting," Hawkins said. "We are watching this very closely."

Although the storm fell apart after assaulting the Caribbean, it was poised to track up through Macon and Middle Georgia.

If it had held together, Macon could have seen heavy rain, flooding and gusty winds capable of bringing down trees and power lines.



(TNS) - When Hurricane Katrina battered the Gulf Coast, Georgia threw open its doors, exposing the state’s character as it improvised to help tens of thousands of needy evacuees. But the storm also exposed weaknesses and gaps in the state’s emergency operations, with lines that stretched as long as football fields and confusion about who was in charge of what.

Government and non-profit groups here say they have tackled shortcomings that hobbled their response to the huge storm a decade ago. But some systemic challenges remain that could hamper the state’s ability to handle a future disaster, according to a review by The Atlanta Journal-Constitution of public records and interviews with state, local officials and non-governmental leaders.

The early response to Katrina in Georgia was plagued by communication missteps and turf battles. In particular, the region’s balkanized government made it tough to coordinate a response. The same problems, according to experts, became apparent during the 2014 ice storm that paralyzed the region.



(MCT) - The night of June 26, 72-year-old Robert Miller was wading through a foot and a half of standing water in his garage despite an infection on his leg, trying desperately to salvage his belongings that were damaged in a 100-year rainstorm.

"You ain't got time to think," Miller said. "You're just running, trying to grab."

The Waverly Road resident was out of town when the 1,000-year storm hit Jeffersonville a little more than two weeks later. He said if his daughter, Karen Wigginton, and her husband hadn't come to his empty home, he would have lost his cars and countless other possessions.



Since its beginning in 2004, National Preparedness Month has been observed each September in the United States. Originally created by the Federal Emergency Management Agency (FEMA), the campaign encourages people to make plans and preparations for emergencies in their homes, businesses and communities.

While it can be argued every month is a “preparedness month” for business continuity, IT and disaster recovery professionals, September is nevertheless a good time to take stock of contingency plans and communicate important resiliency concepts to employees, suppliers and other stakeholders. Here are a few things you might consider in order to take advantage of the national focus on preparedness.

Encourage employees to have a family preparedness plan.

Business continuity plans rely on people to carry them out. The expectation is that individuals will be available and willing to address the business crisis at hand. However, if the safety and security of employees’ families is uncertain, it will be difficult, if not impossible, for them to focus on work-related responsibilities. Having a family preparedness plan in place will provide the employee with a measure of comfort and security, and as such, should be encouraged by the employer.



WASHINGTON  – Disasters like floods, hurricanes, wildfires, tornadoes, and earthquakes are a harsh and frequent reality for much of the country. According to a recent survey conducted by FEMA, progress has been made; however, fewer than half of Americans have discussed and developed an emergency plan with their household.

Today, the Federal Emergency Management Agency (FEMA) and the Ad Council launched a new series of public service announcements (PSAs) to encourage families to develop an emergency communication plan before a disaster occurs. An extension of the national Ready campaign, the new PSAs launch in conjunction with the 12th annual National Preparedness Month, serving as a reminder to take action to prepare for the types of hazards that could impact where you live, work, and vacation.

"The last thing you want to be worried about during a disaster is how to communicate with your family members," said Administrator Craig Fugate. "Have that conversation today. It doesn't cost a thing."

The new campaign includes English and Spanish-language TV, radio, outdoor, print and digital PSAs. Created pro bono by Chicago-based advertising agency Schafer Condon Carter, the PSAs illustrate the importance of having a family plan in the event of an emergency by showing real emergency moments and asking the question, “when is the right time to prepare?”  The viewer is encouraged to develop a family emergency communication plan through the clear message, “Don’t wait. Communicate.” The PSAs direct audiences to Ready.gov/communicate for tools and resources to help develop and practice a family emergency communication plan.

“Through the Ready campaign, we’ve made a lot of progress educating and empowering Americans to prepare for all types of emergencies, but there are still so many families that don’t have a plan,” said Lisa Sherman, President and CEO of the Ad Council. “Having these conversations is really important and can have a big impact on our families’ safety in the event of a disaster.”

“SCC is honored to work with the Ad Council and FEMA on the Ready campaign,” said David Selby, President and Managing Partner of SCC. “This new campaign provides powerful imagery and a critically important call-to-action that we hope will cause individuals and families to pay attention, lean in and, ultimately, take action.”

Localized television and radio PSAs were created and will be available for 27 states, Guam, the U.S. Virgin Islands, Washington D.C., and New York City as part of an ongoing collaboration with state and local emergency management partners. These PSAs drive audiences to their local organization’s website for resources and information pertinent to their area.

As an extension of the national Ready campaign, versions of the PSAs were created for Ready New York, a local initiative that was launched in partnership with the New York City Office of Emergency Management in 2009. Tailoring the message to the unique challenges faced by people living in New York City, audiences are directed to call 311 or visit NYC.gov/readyny, where they can find preparedness resources, including 11 Ready New York guides in 13 languages and audio format.

Managed and sponsored by the Ready campaign, National Preparedness Month is designed to raise awareness and encourage Americans to take steps to prepare for emergencies in their homes, schools, organizations, businesses, and places of worship. National Preparedness Month is an opportunity to share emergency preparedness information and host activities across the country to help Americans understand what it truly means to be ready.

National Preparedness Month Weekly Themes

  • Week 1 (September 1–5)  Flood
  • Week 2 (September 6–12)  Wildfire
  • Week 3 (September 13–19)  Hurricane
  • Week 4 (September 20–26)  Power Outage
  • Week 5 (September 27–30)  Lead up to National PrepareAthon! Day, September 30

National Preparedness Month culminates with National PrepareAthon! Day on September 30, when cities and counties across the country are planning community-wide events that bring together schools, the business community, government, faith leaders, hospitals, individuals and families, and others to take part in preparedness drills and activities for hazards relevant to their area.

Since its launch in 2003, the Ready Campaign has received nearly $1.2 billion in donated media and has helped to generate more than 92 million unique visitors to Ready.gov. The Ad Council is distributing the new PSAs to media outlets nationwide this week, and the PSAs will run in donated time and space.

For more information visit Ready.gov/September or follow the campaign on Facebook and Twitter. For more information about National PrepareAthon! Day, visit www.ready.gov/prepare.

Wednesday, 02 September 2015 00:00

Why All Businesses Need an Emergency Response Plan

Hardly a day goes by that we don’t hear a news story involving an emergency: weather catastrophes, fires, intense medical situations and, occasionally, an angry gunman. If any of these situations occurred in your place of work, would the staff know what to do, who to contact or where to go? And afterward, would anyone know how to access critical documentation and company information if the business were destroyed or otherwise unable to open?

A detailed emergency response plan provides guidance for employees in the event of a disaster. Having a plan will help workers make it through the chaos and to a safe location or assist personnel in getting medical assistance should it be needed.



Wednesday, 02 September 2015 00:00

Three Steps to Protecting Your Clients' Business

As a whole, small businesses are tremendous economic engines–creating jobs, stimulating growth and fostering innovation. Unfortunately, their size makes them especially vulnerable to catastrophic incidents such as fire, flood and extreme weather events, as well as to more localized problems including equipment failures, theft, and cybercrime.

Anxiety about the impact of these risks on SMB clients can keep MSPs and IT solution providers awake at night, and rightly so. The cards aren’t stacked in the SMBs’ favor. For example, according to the Federal Emergency Management Agency (FEMA), 40 percent of businesses don’t reopen after a disaster, and another 25 percent fail within one year of the catastrophe. Fortunately, MSPs and IT solution providers are uniquely qualified to help their SMB clients prepare for any impending disaster, whether natural or man-made, and to put the tools and processes in place to ensure that catastrophes–should they strike–don’t spell the end of the business.



The never-ending quest to achieve trusted IT advisor status gets a lot of attention, but MSPs need the right strategy to get there. For some providers, the path to trusted IT advisor status leads through the NOC (network operations center) and data analytics.

As MSPs mature and master remote service delivery, the imperative to constantly find new and different ways to add value for customers never goes away. If anything, it becomes more pronounced because of the inevitability of commoditization.

Partnering with a NOC vendor to deliver round-the-clock support and routine functions, such as patch management and systems maintenance, enables you to better focus on the consultative part of the business.



Tuesday, 01 September 2015 00:00

Man-Made Disasters, Global Impact

A New York Times article over the weekend takes a behind-the-scenes look at the recent deadly blasts at the port city of Tianjin in China.

The series of explosions and fires that began at a hazardous chemicals storage warehouse in the Binhai New Area of Tianjin on August 12 leveled a large industrial area, leaving at least 150 dead and more than 700 injured.

As reported by the NYT, the lack of safety and oversight at the world's third-largest port is shocking.



Shadow IT is happening in your company, and it is causing serious security problems. My IT Business Edge colleague Arthur Cole reported on a recent Cisco study that found that CIOs really are in the dark when it comes to shadow IT, with the use of forbidden apps 15 to 20 times higher than anticipated:

On average, the report states, IT departments estimate their companies utilize about 50 cloud services while in fact the number is 730. And the discrepancy between reality and perception is growing quickly: one year ago it was 7x; within six months it had jumped to 10x. At this rate, the number of shadow apps could top 1,000 for the average enterprise by the end of the year.

So, we know that employees are turning to shadow IT and we know that it is a serious security risk. That leaves the most important question: Why? What’s happening in the workplace that has employees going rogue with their applications?
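The multipliers in that excerpt can be checked with a little arithmetic. The sketch below is purely illustrative: it takes the 50 and 730 figures and the six-month-old 10x ratio from the quote above, and assumes a simple straight-line trend, which is not necessarily how the study arrived at its 1,000-app projection.

  # Illustrative only: figures are taken from the excerpt above; the
  # straight-line projection is an assumption, not the study's method.
  estimated_by_it = 50        # cloud services IT departments believe are in use
  actually_in_use = 730       # cloud services actually observed

  ratio_now = actually_in_use / estimated_by_it       # about 14.6x
  print(f"Reality vs. perception today: {ratio_now:.1f}x")

  # The study also cited 7x a year ago; here only the 10x figure
  # from six months ago is used to estimate the recent trend.
  ratio_six_months_ago = 10
  growth_per_month = (ratio_now - ratio_six_months_ago) / 6

  # Project six months ahead, holding IT's 50-service estimate constant.
  projected_ratio = ratio_now + growth_per_month * 6
  projected_apps = projected_ratio * estimated_by_it
  print(f"Projected shadow apps in six months: ~{projected_apps:.0f}")   # about 960

Run as written, the gap today works out to roughly 15x, and even this crude extrapolation lands near the 1,000-app mark the excerpt mentions.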



Aug. 29, 2015 marked the 10th anniversary of Hurricane Katrina. During the storm and the ensuing chaos, 1,800 people lost their lives in New Orleans and across the Gulf Coast. Many of these deaths, as well as the extensive destruction, could have been avoided or minimized if there had been better planning and preparedness in anticipation of just such an event, and if there had been much better communication and collaboration throughout the crisis as it unfolded. Responsibility falls on many, from government officials (at every level) to hospitals to businesses to individuals.

If there is any silver lining to such a destructive event, it’s that it forced many in the US to be much better prepared for the next major catastrophe. Case in point: in October 2012, Superstorm Sandy barreled through the Caribbean and the eastern US, affecting almost half of the states in the country. The storm caused unprecedented flooding and left millions without access to basic infrastructure and thousands without homes, but this time about 200 people across 24 states lost their lives.



Tuesday, 01 September 2015 00:00

How COSO Destroyed Risk Management

“The Committee of Sponsoring Organizations of the Treadway Commission (“COSO”) was organized in 1985 to sponsor the National Commission on Fraudulent Financial Reporting, an independent private-sector initiative that studied the causal factors that can lead to fraudulent financial reporting. It also developed recommendations for public companies and their independent auditors, for the SEC and other regulators, and for educational institutions.

The National Commission was sponsored jointly by five major professional associations headquartered in the United States: the American Accounting Association (AAA), the American Institute of Certified Public Accountants (AICPA), Financial Executives International (FEI), The Institute of Internal Auditors (IIA), and the National Association of Accountants (now the Institute of Management Accountants [IMA]). Wholly independent of each of the sponsoring organizations, the Commission included representatives from industry, public accounting, investment firms, and the New York Stock Exchange.



The Weather Company delivers, on average, 15 billion weather forecasts to consumers and businesses every day. That’s a more than 25-fold increase over the past five years, says Mark Gildersleeve, president of the business division of The Weather Company, which also owns the Weather Channel. The Weather Company is partnering with IBM to deliver those forecasts in real time for 2.2 billion locations across the globe – a feat that would have been unthinkable without the recent advancements in cloud, mobile and data analytics. The Smarter Planet caught up with Gildersleeve to talk about how these new tools and technologies have improved forecasting and changed his business.



By Saul Haro

Sometimes the missing pieces of a puzzle can be right in front of you.

That’s how it was for me and my colleagues a few years ago. We were working in the supply chain and import/export group of a major automotive parts manufacturer and tasked with making sure operations moved smoothly.

It goes without saying that the automotive industry is huge, with hundreds of suppliers contributing parts and services to a single vehicle. But for context, consider that Toyota estimates the average car consists of about 30,000 individual parts – parts that have to be ordered, procured, shipped, delivered, received, installed, and tested. In this light, it’s easy to understand just how important managing the supply chain is to a successful production process.



So, I am writing a complete Business Continuity Planning Governance Guide and Standards manual for one of my clients, and it dawned on me that this process really is a neat little building-block methodology that might best be explained simply through a “Twelve Days of Christmas”-like presentation.

This is NOT part of the manual I am creating, but I thought I might share it with you. So … here goes …