One key aspect of disaster recovery planning has traditionally been returning to business as usual after a natural disaster. But what if the effects of natural disasters could be avoided altogether? Is it time for the serious corporate contingency planner to consider proactive planning efforts geared toward mitigating or even preventing natural disasters? The authors believe it is, and that such a planning effort is the next step in what is arguably the 60-year history of contingency planning.
In this article we describe practical science and technology (S&T) that can make the goal of protecting commercial organizations from natural disasters a reality.
Natural Disaster Planning, a New Frontier
If natural disasters are defined as the intersection of hazardous phenomena and human presence, today’s S&T can be employed to predict and possibly limit the overlap of these two primary risk components, thereby reducing their impacts. How can S&T reduce risks? Well, for starters, the elite recovery planner could be doing a lot more to model actual events (not just assumptions) given the thousands of data sources available today. Use of statistical and historical data, satellite imagery, and today’s computing technology, combined with assumptions that accurately estimate an organization’s ability to recover, is redefining disaster management. Such studies not only indicate which types of companies are most at risk but also quantify precisely how they are at risk. The results can better support mitigation planning and policy decisions.
For example, before locating costly manufacturing plants and operations centers, S&T allows firms to review a site’s demographics, topography, and infrastructure first. Firms looking to locate call centers overseas in places like India might want to check the routing of undersea cables and whether the physical cables or terminating facilities are susceptible to seismic events or sabotage. Energy companies might want to weigh such a study against the all-too-probable prospect of $6-per-gallon gasoline if a major refinery were struck by an earthquake or Category 5 hurricane. There is also a bonus if today’s technology is used correctly: the ability to communicate abstract concepts to non-technical executives. The right S&T can actually visualize risk and risk reduction options.
Contrary to popular belief, executive management will fund a contingency planning effort because the math on return on investment (ROI) is squarely in favor of risk reduction. The capital losses, productivity losses, and client losses associated with business resumption after a disaster are all too great in comparison to the cost of a risk study that minimizes impacts. Management just needs to know that the project is not money into a black hole. By visualizing what would otherwise be complex and voluminous data and presenting it in an understandable format from which sound conclusions can be drawn, S&T can help get your efforts funded as well.
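To see why the ROI math tends to favor risk reduction, consider a back-of-the-envelope calculation like the minimal sketch below. Every figure is a hypothetical assumption chosen purely for illustration, not data from any study, but the structure of the argument is what executives need to see.

```python
# A back-of-the-envelope ROI sketch. Every figure below is a hypothetical
# assumption for illustration, not data from any actual study.

annual_disaster_probability = 0.02   # assumed 2% chance per year of a disruptive event
loss_if_unprepared = 12_000_000      # assumed capital, productivity, and client losses ($)
loss_if_prepared = 3_000_000         # assumed residual loss after mitigation ($)
cost_of_study_and_plan = 250_000     # assumed one-time cost of the risk study and plan ($)

# Expected annual savings attributable to the planning effort
expected_annual_savings = annual_disaster_probability * (loss_if_unprepared - loss_if_prepared)

# Simple payback period for the planning investment
payback_years = cost_of_study_and_plan / expected_annual_savings

print(f"Expected annual savings: ${expected_annual_savings:,.0f}")   # $180,000
print(f"Payback period: {payback_years:.1f} years")                  # ~1.4 years
```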
While visualizing data streams from sensors, satellites, buoys, and other sources is time-consuming for the recovery planner, the result is absolutely compelling when completed and presented to policymakers.
Are Disasters on the Rise? Or Does It Just Seem That Way?
Hazards such as severe storms, flooding and drought, high winds, landslides, wildfires, and other weather-related events are causing untold suffering. One of the six strongest Atlantic hurricanes ever recorded, Hurricane Katrina, hit the United States Gulf Coast on Aug. 29, 2005, taking more than 1,800 lives and doing an estimated $81.2 billion in damage. Over recent decades, there has also been a marked increase in the damage caused by geophysical hazards such as landslides, volcanic eruptions, earthquakes, and resulting tsunamis. So, are disasters on the rise, or do we simply learn of them more these days?
One thing is certain: a large part of the increase in damage is due to the fact that there continue (and will continue) to be more people, business enterprises, and development in places that are at risk. Wildfires cause more problems in densely populated Southern California. Hurricanes do much more damage when they hit seacoasts that had previously been uninhabited but now contain vast new developments. As a result, more events are reported in detail, more lives are lost, and more businesses are affected or destroyed. Numbers alone cannot tell the story, but they help us picture the sheer scale of the problem that disaster managers face. In 2006, disasters killed 23,000 people and cost more than $34.5 billion. Now there is a sound bite for you.
Disasters Cause More Than Physical Damage
Damage to buildings and infrastructure from natural disasters obviously has an adverse impact on a business. The physical damage, however, is only half the story. Research into these events over the past decade demonstrates that a business need not suffer physical damage from a disaster in order to suffer business losses and subsequent failure. Many businesses that experience no physical damage are still eventually unable to continue operating. In such cases, undamaged firms often find themselves without suppliers and/or customers for their products and services. These undamaged firms were victims of what has been referred to as “ruptured relationships” – relationships essential to the business that are now gone by virtue of the disaster.
Following a major disaster, it is not at all uncommon for local businesses to lose customers. The customers move out and may or may not be replaced. If they are replaced, it is often by people with different preferences or buying habits. Even those who stay in the damaged area often spend their available money to repair or replace their homes, and not necessarily to buy a new jet ski, flat screen TV, or meal in an expensive restaurant. Another reason for losses to “undamaged” companies stems from a phenomenon referred to as “tight coupling.” Tight coupling is a systemic relationship between two or more components in which there is little or no “slack” in the relationship. A simple example is a Just-In-Time (JIT) relationship between a supplier and a manufacturer. Organizations that rely on JIT for delivery of goods and services from a single supplier are efficient when everything works as it is supposed to work. However, when a disaster slows or interrupts delivery, both parties suffer along with potential customers who must find a replacement or do without the required services.
So what does this mean to the corporate planner? It might mean business decisions such as choosing warehousing in lieu of, or in addition to, JIT. In disaster-prone areas, warehousing might make more sense than just-in-time delivery because it provides requisite slack in the distribution system. Production costs go up by the cost of warehousing, but in a disaster that cost is offset by having goods on hand with which to continue conducting business. That decision, however, like any business decision, relies on accurate information about the probability of a natural disaster. There are places where such data and tools are made available for those inclined to look for them.
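As a rough illustration of that tradeoff, the sketch below compares the expected annual cost of a pure JIT arrangement with one that adds warehoused buffer stock. Every number is an assumption invented for the example; the point is that the comparison hinges on an accurate estimate of the disruption probability, which is exactly the kind of figure the data sources described next can help supply.

```python
# Hypothetical comparison of JIT-only vs. JIT-plus-warehouse under disaster risk.
# All numbers are assumptions for illustration only.

p_disruption = 0.10                  # assumed annual probability of a supply-interrupting disaster
loss_if_jit_fails = 2_000_000        # assumed lost sales and penalties in a disrupted year ($)
warehouse_annual_cost = 150_000      # assumed cost of carrying buffer inventory ($/year)
residual_loss_with_buffer = 200_000  # assumed loss in a disrupted year even with buffer stock ($)

expected_cost_jit = p_disruption * loss_if_jit_fails
expected_cost_buffer = warehouse_annual_cost + p_disruption * residual_loss_with_buffer

print(f"Expected annual cost, JIT only:        ${expected_cost_jit:,.0f}")      # $200,000
print(f"Expected annual cost, with warehouse:  ${expected_cost_buffer:,.0f}")   # $170,000
```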
The Pacific Disaster Center (PDC) archives terabytes of up-to-date and historical data, ranging from seismic events to storm tracks to demographic and infrastructure data. The PDC has also devised tools to draw useful conclusions from that data. One is the Global Edition of its Natural Hazards and Vulnerabilities Atlas, which is, perhaps surprisingly, available free of charge at present. The PDC atlas is designed to aid the contingency planner in examining populations and infrastructure that may be at risk due to natural hazards in a specific region. The atlas also provides planners in the public sector (state, federal, local) a medium for understanding the type, frequency, and severity of hazards that may threaten their jurisdictions. The center has traditionally been utilized to produce high-resolution vulnerability analyses that help reduce overall risks.
Until now, however, the focus has been largely on governments and government agencies. Use of these skills and methodologies for commercial enterprises is something rather new, and potentially exciting. The tools and methodology can easily be adapted to virtually any kind of disaster or location. Let’s step through one simple example. If you want to try this yourself, log in now for free at http://www.pdc.org/DRJ.
Step 1 – Examine the Potential for Disaster
Through use of the atlas, historical records of disasters are used in the first step to examine the potential for these events. In Figure 1 below, the locations of major earthquakes and their intensities are shown. (The inset shows a higher-resolution perspective of the Southeast Asia region.) The earthquake intensity layer underlying this diagram maps the Modified Mercalli (MM) intensity that is expected, with a 20 percent probability, to be exceeded during a 50-year period at a given location. The 50-year period represents the average design life of a building.
In Figure 1, for example, dark green areas correspond to intensity expectations of V and below, while orange areas have a 20 percent probability of experiencing an earthquake intensity of IX or higher over the next 50 years. In the most seismically active areas, the symbols representing epicenter locations almost completely obscure the underlying intensity layer at this scale. The atlas’s “zoom” tool can be used to examine these data at a higher resolution. Additionally, the “identify” and “select” tools can be used to view details, including magnitude and date, of recent or historical earthquake events.
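For planners who prefer annual figures, a “20 percent probability of exceedance in 50 years” can be converted into an approximate annual exceedance probability and return period. The minimal sketch below assumes, as a standard simplification, that years are independent; it is an illustration, not part of the atlas itself.

```python
# Convert "20% probability of exceedance in 50 years" to an annual figure.
# Assumes independent years, a standard simplification used for illustration.

prob_in_window = 0.20   # probability the intensity is exceeded at least once in the window
window_years = 50

annual_prob = 1 - (1 - prob_in_window) ** (1 / window_years)
return_period = 1 / annual_prob

print(f"Annual exceedance probability: {annual_prob:.4f}")      # ~0.0045
print(f"Approximate return period: {return_period:.0f} years")  # ~225 years
```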
Cyclones can be investigated in much the same way using the same tool. Figure 2 shows tropical cyclone intensity zones for the same region. These data map areas that have a 10 percent probability of experiencing a tropical storm of a given intensity during a 10-year period. Darker shades of blue correspond to higher storm intensities. Additionally, a current tropical storm, Chan-Hom, off the west coast of the Philippines, is shown.
Step 2 – Where is Critical Infrastructure?
Ok, now let’s take these data and compare them to the location of critical infrastructure where we might propose to locate a new business. Using these tools, the relationship between these hazards and potentially impacted resources can be observed by displaying the hazards along with population centers, roads, railroads, and airports. In this example, Figure 3 shows earthquake risks, along with transportation infrastructure, for a portion of the Philippines including Manila. Figure 4 shows tropical storm risks for the same area.
As can be seen in Figure 3, the frequency and severity of earthquakes in and around Manila place much of the region in the top two categories of earthquake intensity. Most of metro Manila has a 20 percent probability of experiencing an earthquake of intensity VIII or higher within a 50-year period, while the remaining eastern region can expect earthquakes of even higher intensity, IX, during the same time frame.
As Figure 4 shows, this same area is also frequented by tropical storms and falls within the top end of the storm intensity zones. By creating a virtual overlay of earthquake and tropical storm risks in Figure 5, it is possible to draw conclusions such as that the western portion of Borneo, Indonesia, faces relatively low risk, while other areas, such as the Philippines, face high risk from both earthquakes and tropical storms.
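The overlay in Figure 5 is produced interactively in the atlas, but the underlying idea is straightforward to sketch in code. In the hypothetical example below, two small grids of hazard categories are combined cell by cell; the grids, the category scale, and the combination rule are all assumptions for illustration.

```python
import numpy as np

# Hypothetical 3x3 grids of hazard categories (0 = low ... 4 = very high),
# standing in for the earthquake and tropical storm intensity layers.
earthquake = np.array([[4, 4, 3],
                       [3, 2, 1],
                       [1, 0, 0]])
storm = np.array([[3, 4, 4],
                  [2, 3, 3],
                  [1, 1, 2]])

# One simple combination rule: take the worst (maximum) category in each cell.
# A weighted sum is another reasonable choice.
multi_hazard = np.maximum(earthquake, storm)
print(multi_hazard)
```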
Step 3 – Where Are the People?
The next step in a multi-hazard risk assessment process is to look at where the people and infrastructure are. The Atlas allows the contingency planner to analyze the varying degrees of potential exposure of people and infrastructure to the hazards present in a region. The Atlas contains both population density and transportation infrastructure (i.e., roads, railroads and airports) data layers. By combining population density (Figure 6) with transportation infrastructure (Figure 7), one can examine the relative magnitude (or density) of people and infrastructure exposed to potential harm from future occurrences of earthquakes and tropical storms.
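A sketch of the same idea in code: given a population grid and the multi-hazard grid from the previous step, count how many people fall inside each hazard category. The grids and figures below are invented for illustration; the atlas performs this kind of analysis against its real data layers.

```python
import numpy as np

# Hypothetical hazard categories (0 = low ... 4 = very high) and population counts per cell.
multi_hazard = np.array([[4, 4, 3],
                         [3, 3, 2],
                         [1, 1, 0]])
population = np.array([[120_000, 15_000,  5_000],
                       [ 90_000, 40_000,  8_000],
                       [ 20_000, 10_000,  2_000]])

# Sum the population exposed to each hazard category, highest category first.
for category in range(int(multi_hazard.max()), -1, -1):
    exposed = int(population[multi_hazard == category].sum())
    print(f"Hazard category {category}: {exposed:,} people exposed")
```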
Step 4 – Pulling It All Together
Figure 7 above shows transportation and other infrastructure, including roads, railroads, and airports, that is subject to natural hazards. This infrastructure is key to a region’s ability to recover following a disaster. The Atlas supports assessment of the vulnerability of these critical assets to various hazards.
In Figure 7, this infrastructure is superimposed over the population density data from Figure 6, which in and of itself makes for a compelling visual. The categorized population density data, along with the transportation infrastructure data, are analytically combined with the multi-hazard index to identify those areas with a high potential for impact from natural hazards. Specifically, areas with a high hazard index and a high concentration of people and infrastructure are those that warrant the most attention from mitigation efforts and can be expected to require significant resources during the response and recovery phases of a natural disaster.
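A minimal sketch of that analytical combination appears below, using assumed data and assumed weights (neither comes from the PDC; both are illustrations only): normalize the hazard, population, and infrastructure layers, weight them into a single score, and rank the cells.

```python
import numpy as np

# Hypothetical layers: hazard categories, normalized population density (0-1),
# and normalized infrastructure density (0-1). All values are assumptions.
multi_hazard = np.array([[4, 4, 3],
                         [3, 3, 2],
                         [1, 1, 0]])
population = np.array([[0.9, 0.2, 0.1],
                       [0.8, 0.5, 0.1],
                       [0.3, 0.2, 0.1]])
infrastructure = np.array([[0.8, 0.3, 0.2],
                           [0.7, 0.4, 0.1],
                           [0.2, 0.1, 0.0]])

# Normalize the hazard index to 0-1 and apply illustrative weights.
hazard_norm = multi_hazard / multi_hazard.max()
risk = 0.5 * hazard_norm + 0.3 * population + 0.2 * infrastructure

# Rank cells from highest to lowest composite risk; the top cells are the
# ones that warrant the most mitigation attention and response resources.
order = np.argsort(risk, axis=None)[::-1]
rows, cols = np.unravel_index(order, risk.shape)
for r, c in list(zip(rows, cols))[:3]:
    print(f"cell ({r},{c}): composite risk {risk[r, c]:.2f}")
```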
Other Hazards
While we typically think of earthquakes and storms when we think of disasters, recent events have demonstrated the economic impacts of pandemics, including SARS, Avian Influenza (H5N1), and “Swine Flu” (H1N1). Recognizing this, PDC has incorporated these types of threats into its global atlas. Figure 8 shows recent confirmed cases and deaths due to H1N1, an event that brought one of the world’s largest cities, Mexico City, to a near standstill for more than a week.
Summary
Using resources like these, it is possible to produce up-to-date information on floods, wildfires, tropical storms, earthquakes, and volcanoes that is dynamically integrated with a comprehensive set of historical records of other natural hazards and disaster events throughout a region. In this manner it is possible not only to characterize hazards, but to actually compute their probability.
In addition, data on people and infrastructure can be overlaid to provide the basis for understanding exposure, vulnerability, and potential impacts from these hazards. The resulting data and diagrams can in turn be used to gauge the vulnerability of anything from a new beachfront hotel complex to a potential factory location, and to communicate those risks effectively to management.
Leo A. Wrobel has more than 25 years of experience with a host of firms engaged in banking, manufacturing, telecommunications services, and government. An active author and technical futurist, he has published 12 books and over 500 trade articles on a variety of technical subjects. Wrobel served 10 years as an elected mayor and city councilman and is a sought-after speaker. He has lectured throughout the United States and overseas and has appeared on several television news programs. Wrobel is presently CEO of Dallas-based I b4Ci Inc. Portions of this article were adapted from Wrobel’s latest book, “Disaster Recovery for Communications and Critical Infrastructure,” coming soon in 2009 from Artech House Books.
Sharon M. (Ford) Wrobel conducted extensive publishing and regulatory research for her former employer (a 50-state telephone company), a function she continues today as vice president of Business Development for b4Ci Inc. Wrobel was a major content contributor to Leo Wrobel’s 2009 book “Business Resumption Planning Second Edition” and was co-author of “Disaster Recovery for Communications and Critical Infrastructure.” She has published over a dozen trade articles.