
River Oaks Bank Fire Forces Implementation of Untested Disaster Recovery Plan

Thursday, June 6, 1991, was a normal banking day at the River Oaks Bank in Houston, Texas, until 7:30 that evening when fire alarms sounded.

The shrieking alarms forced employees from the building and company officials into a disaster recovery mode — even though the company’s disaster recovery and business continuity plan wasn’t fully operational at the time and had never been tested.

“We had just converted to our current system (IBM AS-400) in November, 1990,” explained John Q. Kershner, senior vice president of River Oaks Bank. “We were scheduled to do a test of our disaster recovery facility in July. Instead, we got to do our test early — in a live mode.”

The fire at the 14-floor facility began on the eighth floor, where offices were being remodeled.
According to Kershner, fumes from varnish that had been used on wood paneling in the offices accumulated and ignited.

The three-alarm blaze that followed sent smoke billowing from the building and forced the construction and clean-up crews and the company’s data processing staff to leave the building immediately.

As fire ripped through the heart of Bank of the Sierra early Oct. 2, bank officials enacted a disaster plan to keep the five-branch independent up and running. The 1 a.m. blaze gutted the bank’s corporate offices, destroying its mainframe computer and central data bank containing customer account information.

But because of past banking disasters going back to 1982, when a blaze toppled a large Minneapolis bank, Bank of the Sierra had a formula to follow. “Even as we stood and watched it burn, we mapped out who we had to call and what we had to do,” said Jim Holly, bank president.

The bank’s disaster contingency plan, crafted by a Minnesota-based company, promptly came into play. “Some things we would have done intuitively, but the plan compressed the time it took to do them,” Holly said. In the last year, department heads had become familiar enough with the 150-page document that they could quickly identify key people, critical tasks and required equipment.

Joel Arel, certified disaster recovery planner and president of Minnesota-based Arel Technologies Inc., flew in to meet with bank officials Oct. 3. “Our disaster contingency plan worked because people had already thought the event through,” Arel said. This is the first time one of Arel’s 750 clients has suffered a major fire. “It’s a rare occurrence, sure, but it’s also an absolute reality,” Arel said.

Arel is the first to say that contingency plans are only as good as the people who set them in motion. Bank of the Sierra officer Dave Mello agrees. “Everyone pulled together and pitched in. People worked around the clock,” he said.

Using backup data nine hours after the blaze began, tellers stood at branch windows at opening time, 10 a.m., ready to transact business as if nothing had happened, as if the blaze had just blown smoke.
Behind the scenes, however, department heads scrambled to resolve a crisis. “The daily transactions weren’t a problem. The problem was in the back room,” said Gilbert Small, bank vice president and CEO. “The challenge was to process the work once we received it over the counter.” Cash deposits, loan payments, interest accruals: all the data channeled daily through Bank of the Sierra’s computer center before the fire would have to be rerouted in the aftermath of what fire officials called the worst fire in city history.

Bank of the Sierra, with assets of $160 million, is the largest independent bank in Tulare County with 18,000 deposit accounts and 10,000 loan accounts. Before the fire, all five branches were connected to the bank’s computer center by telephone lines. The computer center processed between 25,000 and 40,000 individual entries a day.

In the two weeks before the fire, customers had flocked to the bank to refinance loans because of low interest rates, adding to the mounting pressure on the bank to remain current in its accounts.

Bank of the Sierra contracts with Bank Up, a data processing hot site. By 7 a.m. Oct. 2, six hours after the fire started, the San Ramon-based company was ready to receive Bank of the Sierra’s boxed-up daily entries by courier.

Meanwhile, a Denver company, Data Assurance, provided Bank of the Sierra with a mainframe computer. Entries posted in San Ramon were downloaded in Denver, printed and flown back to Porterville daily.
Six days after the fire, the bank had caught up. Postings were current by all accounts, and right on schedule according to the bank’s disaster plan.

By Small’s thinking, without the plan the back room recovery might have taken another week. “And for the bank and some of our customers, that could have been a critical week,” he said.

In the five days after the fire, bank employees and a file restoration team sifted through the rubble and were able to salvage key documents. Important account histories stored in $3,000 fireproof file cabinets lined with volcanic material were saved. Water-damaged files were freeze-dried and shipped to a file restoration center in Hayward.
By Oct. 28, a Virginia company was completing restoration of the last of key computer tapes damaged in the blaze. On the computer tapes were over-the-counter deposit transactions made the day of the fire. The deposits were being processed when fire erupted and ripped through the computer center.

In all, between 800 and 1,000 deposit transactions were still unaccounted for, more than two weeks after the fire. “As you begin to reach a final assessment, gaps begin to crop up,” Holly said.

While waiting for tape restoration work to be completed, bank officials worked with customers to reconstruct the deposit transactions. In the meantime, the bank simply paid all checks on good faith, a policy Holly wasn’t eager to release in the weeks after the fire.

Contracts with Bank Up and Data Assurance evolved out of contingency planning last year.

For banks, disaster contingency plans are mandated by federal regulators. Small said banks became more vulnerable to fire during the rise of the computer age because information was centralized and confined. “If the confined area is struck, more information is struck,” Small said.

Disaster contingency became a focal issue in banking after a Thanksgiving Day 1982 blaze destroyed the headquarters of Northwestern National Bank in Minneapolis, Minn., in what Arel calls the worst bank fire in history.

Reprinted with permission from The Porterville Recorder, Porterville, Calif. Mark Phillips is the newspaper’s business editor.

This article adapted from Vol. 5 #1.

The basic premise of disaster recovery is that a tested plan is the only way to recover. Penn Mutual Life Insurance Company recovered from the fire because, first, it had tested its plan many times before the fire and, second, it executed its disaster recovery plan with expertise and precision.


The alarm was pulled shortly after 4 p.m. on Tuesday, May 30, 1989, when smoke was discovered in the records room on the ninth floor. The fire raced through the ninth floor of Penn Mutual's 530 Walnut Street building in downtown Philadelphia, destroying thousands of documents. At times the temperature hit 2,000 degrees and fire spread quickly among the room's largely paper contents. By early Wednesday morning, it had escalated to a 9 1/2 alarm fire and eventually as many as 500 firefighters were required at the scene. The fire had displaced about 1,500 employees from various companies in the building. Arson is suspected and a reward has been offered.


We interviewed Paul Trainor, Vice President of Information Systems, about the fire and the recovery. The fire, which started in the Penn Mutual records center on the ninth floor, two floors above the data center, continued to burn for two days.

The firefighters poured 12 million gallons of water on the fire, and the runoff flowed down to the data center, destroying the ceiling tiles and causing severe water damage to the computer equipment. The plastic sheets used to cover the equipment were ineffective against the enormous amounts of water.

At approximately 9 p.m., Mr. Trainor decided they could not continue and declared a disaster to SunGard Recovery Services. (SunGard Recovery Services provides alternate data processing facilities and services in the event of a computer disaster.) Penn Mutual’s backup tapes arrived at SunGard’s Philadelphia Recovery Center at 1 a.m., and by about 9 a.m. the data was restored. By 11:55 a.m. Wednesday morning, every application was up and running except two minor internal tracking systems, which were brought up within two hours.

Because of the complexities of partial backups, Penn Mutual had changed its philosophy from taking partial backups of designated critical applications to performing full backups.
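The appeal of that shift can be sketched in a few lines of code. The function below is a hypothetical illustration, not Penn Mutual's actual 1989 procedure: a full backup simply archives everything under a root directory, so there is no list of "critical applications" to maintain or to get wrong on the day of the disaster.

```python
import os
import tarfile

def full_backup(source_dir: str, archive_path: str) -> list[str]:
    """Archive every file under source_dir; unlike a partial backup,
    no per-application selection list has to be kept current."""
    archived = []
    with tarfile.open(archive_path, "w:gz") as tar:
        for root, _dirs, files in os.walk(source_dir):
            for name in files:
                path = os.path.join(root, name)
                # Store paths relative to the backup root so the
                # archive restores cleanly at a recovery site.
                tar.add(path, arcname=os.path.relpath(path, source_dir))
                archived.append(path)
    return archived
```

The tradeoff is volume: a full backup copies everything every time, which costs tape and time, but it makes the restore step a single unconditional operation rather than a judgment call made under pressure.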

Their goal was to restore the system and begin operation at the recovery site within 24 hours. As a result of the testing they had done previously, and with the help of SunGard's skilled professionals, they were able to recover the operating environment and key applications within 13 hours.


Nearly two years ago, Penn Mutual moved most of its business staff out of the Philadelphia location to a suburban site 20 miles away, so communications to those offices had to be established. The company uses a T1 circuit and dial backup alternatives to communicate with nationwide agency offices in a recovery mode. The communications equipment at SunGard was able to handle all of Penn Mutual’s communication needs. They used SunGard’s SunNet II modems locally and sent others to the outlying branches.


Mr. Trainor stated they had regularly tested their recovery capability and had just completed a test two weeks prior to the fire. The test also familiarized Penn Mutual with SunGard's facilities and personnel. He also commented, "If it hadn't been for our vigorous testing program, we would have had an extended outage. There's no question of that!"


The effect of the fire on the other parts of the company was minimal. Ninety-eight percent of the administrative and customer relations functions and personnel were at other locations.


The command center set up at SunGard was the single point of contact for the outside world. All calls came through command center personnel and could be handled in an orderly manner; staff fielded questions about the fire, the data processing center and how long operations would be down. The command center was also the place from which the recovery plan was implemented.


It was apparent to Penn Mutual that the outage was going to be long term and that they needed to start the transition to SunGard's cold site. A major problem, according to Mr. Trainor, was that he had to acquire a complete data center in a relatively short period of time. They had to find and acquire 170 gigabytes of DASD. Short-term leasing is very expensive. Mr. Trainor said, "The first couple of days you are in total shock. Then you realize that you have to populate your cold site. The main question at that point was which vendors could deliver on time and which ones could not. In general, the larger equipment suppliers all did an exquisite job and some of the smaller ones did not."


A key issue that must be addressed when reconstructing your DP environment is determining the insurance settlement. If your equipment is not totally destroyed by fire, you might not get a full settlement for it, yet you still have to acquire equipment for the cold site.


Many important issues have been mentioned in this article. Penn Mutual stresses that the major reasons for its successful recovery were: (1) preparing a disaster recovery plan, (2) subscribing to SunGard, and (3) testing, testing and more testing.

Thanks to Penn Mutual’s comprehensive, thoroughly tested recovery plan, implemented in conjunction with SunGard’s professional staff, the company recovered successfully from what could have been a devastating disaster.


This article was written by Richard Arnold, editor-in-chief, Disaster Recovery Journal.

This article adapted from Vol. 2, No. 3, p. 4.

At 10:30 p.m. on the night of May 4, 1988, Los Angeles’ worst high-rise fire swept through the 62-story downtown headquarters of First Interstate Bank, destroying floors 12 through 16. Beyond those five gutted floors, smoke and water damage on the remaining floors meant that the entire building had to be closed indefinitely.

Senior management and bank security were notified and in motion by 11 p.m. One member of the Business Resumption Planning (BRP) staff was having dinner in a restaurant just one-and-a-half miles from the building, and was alerted to the fire by the fact that the restaurant patrons kept looking out the window. When he saw the fire, he called BRP management, and then proceeded to the burning building for an eyewitness report. He reported on the phone at 11 p.m. that it looked as if the entire building was going to burn. By midnight, bank security, senior management and the BRP staff were in place in the Emergency Operations Center (EOC) located in the First Interstate Operations Center just seven blocks from the burning building. By 1 a.m., the height of the fire, representatives from the critical operational areas were either present in the EOC or had been notified by phone. As floor after floor burned, covered in graphic detail on television and monitored from the EOC, the planning for business recovery was well underway.

February 1, 1988, a relatively quiet Monday morning, marked the beginning of the second semester of classes at Ferguson Middle School.

Overcast skies and cool temperatures surrounded the three campus buildings known as the West Building, East Building, and the Annex.

Between 8:30 a.m. and 9:30 a.m., the office noticed some difficulty with the intercom system. Calls between the office and classrooms were difficult, and “call in” lights came on without being activated. Custodians were notified and asked to check into the problem. About 9:45 a.m., two students from our A/V Lab, located in the West Building, came into the office indicating that they smelled smoke in the hallway. An Assistant Principal immediately left his office and headed to the West Building. At 9:50 a.m., the bell rang for students to be dismissed to their third hour class. A teacher in the hallway opposite the A/V room saw smoke in the ceiling around a fluorescent light fixture. As her students came into class, she noticed the amount of smoke was increasing. She attempted, unsuccessfully, to call the office. Realizing that the intercom was out of order and the smoke was increasing, she began telling students to calmly exit the building. The Assistant Principal on the scene attempted to notify the office of the smoke, as did several teachers. At approximately 9:55 a.m., the room-to-room intercom system was completely inoperable. Again alerted to the situation by students sent by teachers, the secretary made an “all call” announcement asking students to exit the West Building immediately.

At this time, the “all call” system of the intercom was still functional. Responding to the first teacher who noticed smoke and to the “all call,” students and teachers exited the building as quietly and calmly as possible. All students were out of the building by 9:57 a.m. thanks to efficient teacher-to-teacher communication. At 9:58 a.m. the secretary placed a call to the Ferguson Fire Department and notified the Superintendent’s Office of the fire. The firemen arrived on the scene at approximately 10:03 a.m.

While the smoke continued to increase, all students from the West Building were directed to move into the East Building gymnasium. One administrator and the Fire Chief went through the West Building checking rest rooms and all classrooms to make sure that no students remained. At approximately 10:08 a.m. the halls of the West Building were smoke-filled and all students had been safely evacuated into the East Building gym. From 10:10 a.m. to 11:05 a.m. the firemen tore holes in the attic and roof and walked the halls trying to locate the source of the fire. Their attempts were unsuccessful. At approximately 11:10 a.m., the heat in the attic had grown so intense that the entire roof exploded into flames. Numerous fire trucks, news media vans, and interested community members assembled to view the eight-alarm blaze.

Even before busses were called to dismiss school, many of the bus drivers (who had completed their routes and were on their way home) heard the news announcements regarding the fire at Ferguson Middle School. Realizing the need for transportation, enough drivers returned to the bus depot to help transport students home. By 11:30 a.m. busses had arrived and all students left Ferguson Middle School not knowing when they would return. From 11:30 a.m., teachers, parents and students from the community stood in amazement watching the tremendous devastation occurring to a beloved school. A seventh grader perched himself on his bicycle seat and looked sadly at the smoking debris. “At first everybody joked around: ‘oh boy, no school,’” he said. “But deep down inside, I think they felt something. I went home and cried.” People lingered, trying to absorb the loss personally and to the community. The loss was personal, since many of those present represented second or third generations who had attended Ferguson Middle School.

Watching the building burn, I imagined one crisis after another developing. What do we do first? How do I regain order? What will the media want to know, and how will I answer? How will we continue this school year? How can I keep the staff together? As I watched the roof crumble in blackened debris, I could also see the spirits of teachers and other staff tumble and fall. Perhaps an opportunity to make a positive impact existed somewhere in this nightmarish event. Communications appeared to be the probable key to success or failure during this tragedy.

We were trained in what to do in a fire situation if we were occupying the building. We knew how to get out of the building. We didn't think about the devastating aftermath. The success of the recovery was due to the unselfish participation of students, parents and the community. A contingency plan would have reduced our anguish considerably and helped direct the efforts of everyone.

Written by Daryl K. Hall, Principal, Ferguson-Florissant School District.

This article adapted from Vol. 2, No. 4, p. 34.