Developing risk maps, heat maps and risk rankings based on subjective assessments of the severity of impact of potential future events and their likelihood of occurrence is common practice. These approaches give an overall picture of the risks, are simple and understandable to most people, are often the result of a systematic process and provide a rough profile of the organization’s risks.
Typical attributes of a risk map include: governing objectives, drawn from a business strategy or plan, that provide context for the assessment; a common risk language that provides a perspective for understanding risk; and predetermined criteria for conducting the assessment. While everyone agrees that an effective risk assessment should never end with just a list of risks, it is not unusual for traditional risk assessments to hit a wall, leaving decision makers with a list and little insight as to what to do next. In addition, there is the common complaint that risk assessments rarely surface an “a-ha!” that alters senior management’s view of the world.
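The severity-and-likelihood ranking described above can be sketched in a few lines. This is a minimal, hypothetical illustration: the risk names, the 1-5 rating scales and the multiplicative scoring rule are all assumptions for the example, not part of any particular methodology.

```python
# Hypothetical sketch of a severity x likelihood risk ranking.
# The risk names and 1-5 ratings below are illustrative, not real data.

risks = [
    {"name": "Data breach",       "severity": 5, "likelihood": 3},
    {"name": "Supplier failure",  "severity": 3, "likelihood": 4},
    {"name": "Regulatory change", "severity": 4, "likelihood": 2},
]

def score(risk):
    """Combine the two subjective 1-5 ratings into a single ranking score."""
    return risk["severity"] * risk["likelihood"]

# Sort from highest to lowest combined score -- the "risk ranking".
for risk in sorted(risks, key=score, reverse=True):
    print(f"{risk['name']}: score={score(risk)}")
```

The multiplicative score is one common convention for placing risks on a heat map grid; it produces exactly the kind of ranked list the passage warns should be a starting point for decisions, not the end product.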
CIO — As federal CIOs ramp up initiatives in hot IT areas like cloud computing and virtualization, they are looking to dramatically reduce the number of data centers the government maintains around the country, though officials acknowledge that that effort will take years to complete as agencies work through a litany of challenges.
For starters, the sheer size of the government's IT apparatus -- roughly $80 billion annually -- poses a challenge of a scale that dwarfs any single entity in the private sector.
In September, the Information Security Forum (ISF) released a report, “Managing BYOD Risk: Staying Ahead of Your Mobile Workforce,” which found that many companies, in their rush to institute some kind of BYOD security policy, often neglected or rushed risk management. Incomplete or ineffective policies leave the company open to threats against its network. Instead, ISF encourages organizations to take an “info-centric” approach to BYOD policy.
I had the chance to speak with Steve Durbin, global vice president of ISF, about the report.
Poremba: When talking about risk management in terms of BYOD, what exactly do you mean? Is it just good security practices or something more?
4th Annual Report helps CFOs, risk professionals navigate shifting landscape
AUSTIN, Texas – While much attention has been focused on the implementation of the Affordable Care Act (ACA), many technology and life science firms may be overlooking the insurance market trends that are driving purchasing decisions in their corporate risk management programs.
The fourth annual TechAssure Association Benchmarking Report has been released. The report is one of the most comprehensive data collections across multiple insurance brokerage firms in the market: TechAssure Association collected, reviewed and analyzed data from its 21 insurance brokerage members. The report covers corporate insurance pricing for the technology industry, updates on purchasing trends and limits of liability.
We found that technology and life science firms seek a small handful of key qualities when selecting an insurance broker and carrier for their corporate risk management needs, and our report captures that information. Middle-market technology companies face significant challenges and need better support in the areas of insurance and risk management; incomplete management teams and limited resources are common. The services that TechAssure Association members provide help the technology industry develop a system for managing its unique exposures.
Other findings in the annual report provide unique insights into the insurance market responses for each technology sector. Technology sectors covered in the benchmarking report include Software, Internet, Gaming, Manufacturing and Semiconductor.
"Benchmarks are useful as one of many pieces of information that help you understand your insurance and risk management program," said Julie Davis, Executive Director of TechAssure Association. "One of the strengths of the TechAssure Association Benchmarking Report is that it collects data from a group of insurance brokerages and practices that focus on the complexities of the technology and life sciences industries."
The benchmarking report, conducted by TechAssure Association, is available from our brokerage members. For more information on TechAssure Association and our member firms, please visit the TechAssure Association website at www.techassure.com.
Data is exploding. The variety of data being created by workers inside and outside of the workplace and the velocity at which that data is being shared makes corporate compliance officers sleep with one eye open, because uncontrolled data equals unknown risk, and the unknown is scary. Think about it: in addition to the terabytes of data lurking in companies’ disparate systems, organizations today are creating new content that is expected to drive 60 percent growth in enterprise data stores (Worldwide Big Data Technology and Services 2012-2015 Forecast, Mar 2012, IDC).
Most corporate compliance officers are concerned with the latter: newly created data is the shiny object grabbing attention. However, equal focus needs to be placed on legacy data (sometimes known as dark data), which is often unknown, unmanaged, and may be out of compliance with internal or external requirements. Many organizations today deal with information sprawl by throwing more storage at the problem, accepting the risk as a cost of doing business, or by simply ignoring it. Neither is an ideal way to protect the organization. In fact, 31 percent of organizations report that poor electronic recordkeeping is causing problems with regulators and auditors (Information Governance: Records, Risks, and Retention in the Litigation Age, AIIM 2013). Further, an individual data breach costs organizations an average of $5.5 million (2011 Cost of Data Breach Study, Ponemon Institute, 2011). There are also countless examples of fines, sanctions or adverse inference decisions triggered by data being accidentally lost or mishandled.
To get a handle on dark data, it is first important to understand what it is. Dark data can take many forms, including both structured data (machine-created information that typically fits in rows and columns) and unstructured data (human-generated information that is much more difficult to search). It can also come in many formats and reside in many places, making it more difficult to access. It can be amassed simply because of our reliance on cheap storage or because of special circumstances like M&A. In virtually all cases, legacy data poses legal, regulatory and internal risk if it isn’t managed effectively.
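In practice, getting a handle on dark data usually begins with a simple inventory of what is actually sitting on legacy stores. The sketch below is a hypothetical first pass, assuming a locally mounted file share; the extension-based grouping is an illustrative simplification, not a full structured/unstructured classification.

```python
# Hypothetical sketch: a first-pass inventory of legacy ("dark") data on a
# file share, grouping files by extension so unknown content can be triaged.
# The root path passed by the caller is an assumption of the example.
import os
from collections import Counter

def inventory(root):
    """Count files per extension under root and total their size in bytes --
    a crude map of what formats are lurking and how much storage they use."""
    counts = Counter()
    total_bytes = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            ext = os.path.splitext(name)[1].lower() or "<none>"
            counts[ext] += 1
            total_bytes += os.path.getsize(os.path.join(dirpath, name))
    return counts, total_bytes
```

Even a rough count like this turns "unknown, unmanaged" data into something a compliance team can prioritize, for example by flagging extension-less or unrecognized formats for closer review.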
Ian Kilpatrick considers the risks to businesses from the proliferation of wireless access points and discusses the benefits of deploying secure access points, which are directly linked to gateway security.
Wireless, mobility and BYOD are all part of an unstoppable wave, based on widespread consumer and remote worker usage. With the new faster wireless standard, 802.11ac, due to be approved in November this year, and with 4G continuing to grow, demand for fast wireless in the workplace will increase inexorably.
While this creates multiple opportunities, it also creates a great many challenges. If, for example, your existing wireless network is insecure, building on that base of sand is always going to fail.
Historically, for many organizations, both large and small, wireless was a tactical solution to a user-driven demand for laptop (and subsequently smartphone and tablet) mobility in the office.
Based on current disaster trends and economic values, the world is looking at a minimum cost in the region of 25 trillion dollars in disaster losses for the 21st century if there is no concerted response to climate change, one which puts the emphasis on practical measures to reduce disaster risk and exposure to future extreme events. This is according to a statement by the UN Office for Disaster Risk Reduction (UNISDR).
CIO — The announcement last month that cloud storage provider Nirvanix was closing up shop set off a wave of hysteria in the IT world and sparked speculation about the viability of cloud storage as an option for businesses.
The fear is understandable given the value of business data. However, with proper contingency planning and a solid backup/disaster recovery plan, such a closure doesn't have to be a big deal.
"This is not remarkable -- it has happened before. Just to name a few, EMC, Sun, Iron Mountain, a lot of 'big' companies have shut down solutions -- even cloud storage solutions -- shuttered divisions, and ended the lifecycle of products with a huge install base," says Nicos Vekiarides, co-founder and CEO of Natick, Mass.-based cloud storage provider TwinStrata.
"What's different in this case is the quickness with which it happened, and I think there's certainly a lot of hysteria surrounding this announcement simply because it involves the cloud," Vekiarides says.
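The contingency planning mentioned above ultimately comes down to verifying that a second, independent copy of the data exists before a provider disappears. Below is a minimal sketch of such a check, assuming two locally mounted copies and a checksum comparison; the paths and the SHA-256 choice are assumptions for the example, not any vendor's method.

```python
# Hypothetical sketch: verify that every file in a primary store has an
# identical copy in a second, independent backup location, so one provider
# shutting down is not a single point of failure.
import hashlib
import os

def sha256(path):
    """Hash a file in chunks so large files do not need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def unreplicated(primary, backup):
    """Return relative paths of files in primary that are missing from the
    backup or whose contents differ."""
    problems = []
    for dirpath, _dirs, files in os.walk(primary):
        for name in files:
            src = os.path.join(dirpath, name)
            rel = os.path.relpath(src, primary)
            dst = os.path.join(backup, rel)
            if not os.path.exists(dst) or sha256(src) != sha256(dst):
                problems.append(rel)
    return problems
```

Run periodically, a check like this is what makes a sudden shutdown "not a big deal": the backup is known-good before it is ever needed.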
Whether or not rules are made to be broken, company policies are made to be reviewed. What was suitable for an organisation a few years ago may no longer fit today’s requirements. Paradoxically, this is an instance where business continuity management needs to introduce some discontinuity, to avoid the enterprise getting stuck in what could be an inefficient and even dangerous rut. A policy to use only one vendor’s IT equipment could stifle enthusiasm among employees who now want to work using their own devices. On the other hand, a policy of free access to company premises could now leave the company at risk of violating health and safety procedures. The first question is – where do you start?