Customer contact organizations are at the heart of business continuity and disaster recovery strategies, as they are the go-to resource for customers in times of disaster. A new report from Frost & Sullivan looks at these important organizational assets and explores the specific business continuity and disaster recovery issues relating to them.
"The importance of information during times of such distress has made a strong case for advanced and multilayered business continuity and disaster recovery methods," said Frost & Sullivan Information and Communication Technologies Industry Analyst Brendan Read. "This enables contact centers to plan for, respond to and recover from natural and man-made disasters."
Customer contact organizations face two challenges when devising and implementing effective business continuity and disaster recovery programs. The first is balancing the potential risks and losses from adversity against the investment needed to put effective BC/DR solutions in place. The second is enterprises' lack of motivation to deploy these solutions, given the unpredictability of such events.
Jim Burtles, Hon FBCI, provides an overview of the Emergency Evacuation Planning Lifecycle which he has developed and explained in a new book.
For the past 12 years I have been emotionally attached to, and intellectually concerned with, the events of 9/11. As a business continuity specialist, I have struggled with the problems of getting people to safety before their workplace becomes a prison or a tomb.
The end result of many years of research, experimentation and training is a robust, reliable and structured approach to ensuring that people are best prepared to reach safety whenever danger looms. The very latest Business Continuity Lifecycle and its underlying principles have been adapted and applied to create a new, parallel discipline. Adherence to a clearly defined six-stage emergency evacuation planning (EEP) protocol raises the subject from the realm of ad-hoc adventure to that of a disciplined practice with predictable and defensible results.
CIO — FMW Fasteners, a distributor of down-to-earth items such as nuts and bolts, now sees its future in the cloud.
The Houston-based company grew up much like its fastener industry peers, running its business systems in-house and selling through inside and outside sales reps. FMW, however, has evolved to a new model: Running its operations in the cloud. The company deployed NetSuite enterprise resource planning (ERP) software, along with the cloud vendor's customer relationship management (CRM) and ecommerce offerings.
Cloud adoption has dramatically changed how FMW conducts business. The cloud, says FMW sales manager Steve Baker, eliminates the headache of managing on-premises IT, improves business agility and accommodates a high-growth track. "It has completely transformed the business and what we were able to do and our sense of the possibilities of what we could get done."
Agile techniques have become popular over the last few years. They have their roots in software development projects. Unhappy with ‘monolithic’ projects that exceeded both time and money budgets, project teams looked for a better way to deliver useful end-results to software users – one that also kept up with changing requirements into the bargain. With agile methodologies, software is produced and released in short cycles, typically two to four weeks. Testing is done in parallel so as not to delay releases, and users are constantly invited to use the current release and comment on what they find useful or not. Can such an approach be applied to business continuity?
We say it all the time: Data governance should be driven by the business. But let’s face it: IT knows the technology and most of the technology requires heavy IT involvement.
So what does that even mean when you’re talking about something as technology-focused as master data management? And how can CIOs convince the business that data governance is its responsibility?
You may know that White focuses on master data at IT research firm Gartner, but what is less well known is that White is also a supply chain management expert. And like everything else in the world, supply chains are becoming more data-driven. That’s putting pressure on supply chain leaders to deal with their data problems, White explains in the article.
The constant theme in data center circles these days is change. Virtualization, the cloud, solid-state storage—all are driving traditional data infrastructure in new and exotic directions. Most observers, however, tend to view this change in terms of the present, or even the past—that is, how will this new technology solve the problems I’m dealing with today?
It’s not an unreasonable question to ask. In the end, it falls a little short, though, because the true benefit of new technology is usually not in its ability to fix the problems of the past but to open up entirely new benefits for the future. The first ones to envision that future and capitalize on it will become the titans of tomorrow’s data industry.
Gartner hit on this notion recently in its latest evaluation of the cloud industry. While noting that most organizations still need to put cloud infrastructure into motion, analyst Gregor Petri cautioned that the money being spent today to upgrade legacy data centers will be poorly spent if the enterprise maintains a data center-centric view in the new cloud/services era. In other words, why limit the cloud to a mere cost-savings function when it offers so much promise as a revenue and opportunity builder?
By Charlie Maclean-Bristol, MBCI FEPS
The news during the past week seems to have been dominated by the possibility of military intervention in Syria. However, an item which has been pushed down the news agenda is the ‘Rim Fire’ in the north-western part of Yosemite National Park. Although wildfires seem reasonably common in the USA, this one caught my eye because ash from the fire threatened to pollute the Hetch Hetchy reservoir, which serves 2.6 million people and provides 85 percent of the water for the city of San Francisco.
Having worked at Anglian Water in the UK for seven years, first as Emergency Planning Manager and then as Head of Security and Business Continuity, I always take a keen interest in any emergency involving water. The contamination of water by ash is an incident that is new to me, and I wonder how serious it is. Even with my limited knowledge of clean-water purification, I know that treatment plants are pretty good at taking out any possible contaminants. In fact, given the massive dilution associated with a large reservoir and the quality of the treatment, unless you dump lorry loads of contaminants into a reservoir it is actually very hard to pollute an entire reservoir.
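The dilution point is easy to see with some back-of-envelope arithmetic. The figures below are purely illustrative assumptions (a hypothetical 100-gigalitre reservoir and a 20,000-litre road tanker), not actual Hetch Hetchy data:

```python
# Back-of-envelope dilution estimate with illustrative figures.
reservoir_litres = 1e11                   # assumed 100-gigalitre reservoir
tanker_litres = 20_000                    # assumed capacity of one road tanker
contaminant_litres = 10 * tanker_litres   # even ten "lorry loads" dumped in

# Resulting concentration, in parts per million by volume,
# before any treatment at all has taken place.
concentration_ppm = contaminant_litres / reservoir_litres * 1e6
print(f"{concentration_ppm:.1f} ppm")     # 2.0 ppm
```

Even ten tanker loads work out to a couple of parts per million across the whole reservoir, which treatment processes then reduce further.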
Where pollutants have entered the water system it is usually at the processing stage. A classic case was the leaking of diesel from a generator into the water supply of part of Glasgow in 1997, which led Scottish Water to issue a ‘do not drink’ notice to 50,000 people. This was caused not by a spill into one of the reservoirs that fed the city but by a leak at the water treatment plant itself.
The first step in the Risk Management and Own Risk and Solvency Assessment Model Act (RMORSA) implementation, Risk Culture and Governance, lays the groundwork and defines roles for your risk management function. The second step, Risk Identification and Prioritization, defines an ongoing risk intelligence process that equips an organization with the data needed for risk based decision making.
The engine behind this process – the enterprise risk assessment – isn’t a new concept, but organizations are finding that the traditional, intuitive ideas for how to conduct risk assessments are inadequate. Too often, risk managers are interviewing process owners and collecting huge quantities of data, only to find that their top 10 risks are entirely subjective and lack any actionable component. And what good is a top 10 risk if you can’t answer the inevitable question: what are you going to do about it?
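One common way to make a risk list comparable and actionable is to score each risk on likelihood and impact and rank by the product. The sketch below is a minimal, hypothetical illustration of that technique (the risk names and 1–5 scores are invented), not the RMORSA methodology itself:

```python
# Minimal likelihood-x-impact risk scoring sketch (hypothetical data).
# Each risk gets a 1-5 likelihood and a 1-5 impact rating.
risks = [
    {"name": "Data centre outage",   "likelihood": 2, "impact": 5},
    {"name": "Key supplier failure", "likelihood": 3, "impact": 4},
    {"name": "Phishing compromise",  "likelihood": 4, "impact": 3},
]

# Score = likelihood x impact, then rank highest first.
for r in risks:
    r["score"] = r["likelihood"] * r["impact"]

ranked = sorted(risks, key=lambda r: r["score"], reverse=True)
for r in ranked:
    print(f'{r["name"]}: {r["score"]}')
```

A numeric score is still only as good as the ratings behind it, but it at least forces each entry on the top-10 list to carry an explicit, challengeable judgement rather than a gut ranking.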
I haven’t looked at the study itself, only Bulldog Reporter’s story on it, but my reaction was first, “well, duh,” and second, was it really Facebook? Now, I completely support the use of Facebook in a crisis. Coca-Cola, for example, has 72 million likes on its Facebook page, with over 1 million talking about it. Other brands sport similarly astounding numbers. So, if Coke is in a crisis, why wouldn’t they be talking to those people who have already connected with them in this way?
But my question concerns the study and the conclusion the researchers come to. The study involved creating two fake universities, showing students news stories about the crises these universities were in, and then judging student reactions. The researchers then showed the students fake Facebook posts from the fake universities “which gave additional information and messages directly from the universities.”
Businesses are losing the battle against state-sponsored cyber attacks and things are unlikely to improve in the short term, according to a survey of senior IT security professionals.
While 63% of respondents think a state-sponsored attacker will attempt to breach their organisation in the next six months, 74% said they were not confident that their own corporate network had not already been breached by a foreign state-sponsored hacker.
Most respondents said they believe that the hacking landscape is going to get worse over time.