We are in the midst of one of the most monumental shifts in the information technology age to date—an evolution from self-managed IT to IT as a service. With a public cloud services market estimated by Gartner to exceed $244 billion by 2017, service providers looking to capitalize on this tremendous opportunity must focus on rapid time to market and on delivering exceptional managed services to their customers.
However, like most of us, service providers of all types and sizes are being challenged to do more with less: to enable faster R&D cycles and to accelerate customer acquisition while reducing overall spend. It is for these reasons that many MSPs have been looking to leverage VMware’s as-a-service offerings: when it makes sense for their business, partners can buy ready-to-run infrastructure and desktop services as a complement to what they’ve built, and focus on delivering managed services on top.
When Anthem, the second largest insurance provider in the United States, revealed recently that its records had been compromised by hackers — resulting in the possible leaking of personal data of more than 80 million present and former customers — the incident became a much-needed wake-up call for the health care industry.
Unfortunately, Anthem is not the first company to experience a major data breach in the past 18 months. In 2014 alone, customer data, credit card information and intellectual property were stolen from Target, Home Depot, JPMorgan Chase, Sony Pictures and many others. What recent history has taught us is that hackers are becoming more sophisticated, attacks are becoming more malicious and no industry or organization is invulnerable.
The public has moved on from asking, “How did this happen?” to asking, “Why does this keep happening?” The attention on privacy rights, coupled with the growing costs of major data breaches, is elevating the issue of managing the digital enterprise to the board level.
By Gary Hinson and Dejan Kosutic
Most business continuity experts from an IT background are primarily, if not exclusively, concerned with establishing the ability to recover failed IT services after a serious incident or disaster. While disaster recovery is a necessary part of business continuity, this article promotes the strategic business value of resilience: a more proactive and holistic approach that prepares not only IT services but also other business processes before an incident, so that the organization survives events that would otherwise have taken it down and keeps operating in some form during and after them.
According to the BSI Standard 100-4 (2009), “Business continuity management consists of a planned and organized procedure for sustainably increasing the resilience of (time-)critical business processes of an organization, reacting appropriately to events resulting in damages, and enabling the resumption of business activities as quickly as possible. The goal of business continuity management is to ensure that important business processes are only interrupted temporarily or not interrupted at all, even in critical situations, and to ensure the economic existence of the organization even after incurring serious damage.”
Is business continuity important enough to invest time, effort, and money into achieving it? Given that the alternative implies accepting the risk that the business will quite likely fold in a crisis, few in management would seriously argue against business continuity, but that still leaves the questions of how much to invest, and how to invest wisely. These are strategic issues: business continuity is a strategic concern.
A new BSI Global Supply Chain Intelligence report reveals that there were $33 billion of business losses due to natural disasters and $23 billion of cargo theft in 2014. Rapid economic growth in emerging economies, workforce disruptions, political instability and the Ebola outbreak in West Africa led to a rise in business losses. Within Europe, trade interruption due to an array of strikes caused $1.5 billion of direct losses to business.
The report is based on data from BSI's Supply Chain Risk Exposure Evaluation Network (SCREEN) which provides continuous evaluation across 20 proprietary risk factors and 203 countries. The 2014 data reveals a clear picture of the changing global threat landscape and how this varies by country, continent and industry sector.
Shereen Abuzobaa, commercial director, BSI Supply Chain Solutions commented: "Companies are facing an increasingly wide range of challenges to their supply chain, from human rights issues to natural disasters. Such complexity creates black holes of risk for organizations, both directly affecting the bottom line but perhaps more seriously, hidden supply chain risk, damaging a company's hard-earned reputation."
The report warns companies, particularly those in the apparel trade, to scrutinise their global supply chains closely. In Haiti, the report notes, 29 percent of all children between the ages of five and 14 work in slave-like conditions, compared with 5.8 percent in the Dominican Republic and 8.4 percent in Jamaica. In recognition of this growing threat of human rights and environmental violations, governments in Western Europe attempted to push supply chains towards greater compliance with social and ethical norms, increasing the regulatory burden on organizations.
Port congestion and strikes continued to severely affect business continuity across Asia Pacific, the west coast of the United States and Germany throughout 2014. Limited container storage space resulted in cargo discharge times of up to a week, increasing operational costs for companies shipping through Hong Kong by nearly $1 million per month. General strikes across Belgium caused $1 billion of direct losses to business, while airline strikes in France and Germany cost $300 million and $198 million respectively.
BSI's research shows that in 2014 cargo shipments were heavily impacted by a rapid growth in supply chain terrorism. Terrorist organizations such as ISIS imposed systematic controls across Syrian and Iraqi territory, extracting as much as $3 million in revenue per day through extortion and supply chain control schemes. Europe, the Middle East and Africa saw cargo disruption rates stay stable, in stark contrast to the Americas, where disruptions increased, driven by illegal drug introduction, spikes in cargo theft, and terrorist plots and incidents. The Asia-Pacific region saw a rise in disruption, both from a growing methamphetamine trade and from the long-term challenges of counterfeit production and piracy.
While the report highlights cargo theft as a growing risk, it is still outweighed by the economic impact of natural disasters. 2014's top four natural disasters caused a collective $32.8 billion of damage to businesses, with flooding across Pakistan and India making up a third of this figure. Three quarters (75 percent) of the top exporters across the Asia-Pacific region are rated high or severe for natural disaster risk.
BSI's 2014 SCREEN Global Intelligence Report is available for purchase from BSI.
Netwrix 2015 State of IT Changes Survey reveals that nearly 70% of organizations continue to make undocumented changes and only 50% audit their IT infrastructures
IRVINE, Calif. – Netwrix Corporation, the #1 provider of change and configuration auditing software, today announced the results of its 2015 State of IT Changes Survey. The survey of more than 700 IT professionals across more than 40 industries found that 70% of companies make undocumented changes, up from 57% last year. Most surprisingly, the share of large enterprises that make undocumented changes has increased by 20 percentage points, to 66%.
Undocumented changes pose a hidden threat to business continuity and the integrity of sensitive data. The survey shows that 67% of companies suffer service downtime due to unauthorized or incorrect changes to system configurations, with enterprises again the worst offenders, at 73%.
Security-wise, the overwhelming majority of organizations claim to have never made a change that turned out to be the root cause of a breach. However, given that the majority of companies make undocumented changes and only half have auditing processes in place, instead relying on manual review of native logs, their ability to prove the security of their systems is questionable. What does seem clear is that many organizations remain in the dark about what is going on across their IT infrastructures and cannot detect a security violation until a data breach is officially revealed.
Despite the fact that companies still have shortcomings in their change management policies, the overall results of 2015 show a positive trend. More organizations have changed their approach to changes and have made some effort to establish auditing processes to achieve visibility into their IT infrastructures. The key survey findings show that of the respondents:
- 80% of organizations continue to claim they document changes; however, the number of companies that make undocumented changes has grown throughout the year and reached 70%. The frequency of those changes has also increased.
- 58% of small companies have started to track changes despite the lack of change management controls, against 30% last year.
- Change auditing technology continues to capture the market, as 52% of organizations have established change auditing controls, compared to 38% last year. Today, 75% of enterprises (52% in 2014) have established change auditing processes to monitor their IT infrastructures.
- Organizations opt for several methods of change auditing at once. 60% of SMBs traditionally choose manual monitoring of native logs, whereas 65% of enterprises deploy automated auditing solutions.
- Due to established change management controls, more thorough documentation and automated auditing processes, the number of enterprises that managed to identify which changes were the root cause of security incidents has doubled, from 17% in 2014 to 33% in 2015.
"As with years past, errors made by internal staff, especially system administrators, who were the prime actors in over 60% of incidents, represent a significant volume of breaches and records," stated the Verizon 2015 Data Breach Investigations Report. "Understand where goofs, gaffes, fat fingers, etc., can affect sensitive data. Track how often incidents related to human error occur. Measure effectiveness of current and future controls, and establish an acceptable level of risk you are willing to live with, because human fallacy is with us to stay."
"Human factor is the key to informational security and its pain point at the same time," said Alex Vovk, CEO and co-founder of Netwrix. "No matter how advanced the security policy is, people still make mistakes and from time to time misbehave, putting overall system security and business continuity at risk. In this case automated auditing processes can help companies keep their IT systems under control and make sure that any deliberate or accidental changes will be detected and addressed properly to eliminate the risk of a data breach."
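The automated change detection Vovk describes can be illustrated with a minimal sketch. This is a hypothetical example, not Netwrix's product: it fingerprints configuration files with SHA-256 hashes, compares them against a saved baseline, and reports which files changed since the last audit run (the `baseline.json` file name is an assumption for illustration).

```python
import hashlib
import json
from pathlib import Path

BASELINE = Path("baseline.json")  # hypothetical store for the last-known-good hashes

def fingerprint(path):
    """Return a SHA-256 hash of a file's contents."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def audit(paths):
    """Compare current file hashes against the saved baseline and
    return the files that changed since the previous audit run."""
    baseline = json.loads(BASELINE.read_text()) if BASELINE.exists() else {}
    current = {p: fingerprint(p) for p in paths}
    changed = [p for p in current if baseline.get(p) != current[p]]
    BASELINE.write_text(json.dumps(current))  # new baseline for the next run
    return changed
```

Run on a schedule, even a sketch like this turns a silent, undocumented configuration change into a logged event that can be reviewed and attributed, which is the core of the auditing processes the survey measures.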
The key findings are summarized in the infographics.
To download a complete copy of the “Netwrix 2015 State of IT Changes Survey” report, please visit http://www.netwrix.com/go/survey2015.
Meet the Netwrix team, and find out more about visibility into IT infrastructure during the RSA Conference at booth #2817, in San Francisco, April 20-24, 2015, and Microsoft Ignite Conference at booth #239, in Chicago, May 4-8, 2015.
About Netwrix Corporation
Netwrix Corporation, the #1 provider of change and configuration auditing solutions, delivers complete visibility into who did what, when and where across the entire IT infrastructure. This strengthens security, streamlines compliance and optimizes operations. Founded in 2006, Netwrix is named to the Inc. 5000 list and Deloitte Technology Fast 500. Netwrix software is used by 160,000 users worldwide. For more information, visit www.netwrix.com
At the current pace, demand for water will increase by 55% globally by 2050, projects the Organisation for Economic Cooperation and Development (OECD).
The increase will mainly come from manufacturing (+400%), electricity (+140%) and domestic use (+130%). In fact, the World Bank has cited a 40% global shortfall between forecast demand and available supply of water by 2030. Add in competition from agriculture to feed growing populations, and the gap between supply and demand results in very challenging consequences.
This means that water has moved to the top of the business risk agenda - according to the World Economic Forum’s Global Risks Report 2015 - and is a major concern for society as a whole. With business sharing water with communities, industry, farmers and other users, securing the right quality and quantity of water at the right time is set to become a serious production and reputational issue.
PwC’s ‘Collaboration: Preserving water through partnering that works’ report explores the risks for business associated with water and how to collaborate with stakeholders to achieve a common goal: sharing water successfully.
The risks for business from having too much or too little water, or water that's too dirty or too expensive, are increasing. Whether it's used to cool, heat or clean, or as an ingredient, water is also a critical factor in the supply chain; it can cause disruption in storage, damaging stock and hampering distribution. Even financial services is touched by water when investing in or insuring businesses with an unknown or unquantified exposure to water risk. Water permeates right across the business world.
Malcolm Preston, Global Sustainability Leader at PwC says: “Continued effective water management is becoming more complex and costly for business. Identifying and managing the potential material risks both in direct operations and in the supply chain - for example, pollution, flooding, irregular or reduced supply, governance, regulation, climate change, disaster threat, reputational issues etc. - is an important step to managing the bottom line and avoiding sudden costs.
“Ultimately, securing water will come down to effective collaboration with other users in the water basin, and when stakeholders come together, even with the best intentions to work together, they often have hugely differing perspectives and demands. PwC recognises the complexities of collaboration and is able to offer an independent perspective.”
PwC firms help organisations and individuals create the value they’re looking for. We’re a network of firms in 157 countries with more than 195,000 people who are committed to delivering quality in assurance, tax and advisory services. Find out more and tell us what matters to you by visiting us at www.pwc.com.
PwC refers to the PwC network and/or one or more of its member firms, each of which is a separate legal entity. Please see www.pwc.com/structure for further details.
Proofpoint, Inc., has released the results of its annual study that details the ways attackers exploit end-users' psychology to circumvent IT security. The Human Factor Report 2015 reveals that last year was the year attackers “went corporate” by changing their tactics to focus on businesses rather than consumers, exploiting middle management overload of information sharing, and trading off attack volume for sophistication.
The Proofpoint findings reiterate that human actions, not simply system or software vulnerabilities, have significant implications for enterprise security, and show what protection is necessary in a “world where everyone clicks.”
Key findings from The Human Factor Report 2015 include:
- Every organization clicks. On average, users click one of every 25 malicious messages delivered. No organization observed was able to eliminate clicking on malicious links.
- Middle management is a bigger target. Representing a marked change from 2013 when managers were less frequently targeted by malicious emails, in 2014 managers effectively doubled their click rates compared to the previous year. Additionally, managers and staff clicked on links in malicious messages two times more frequently than executives.
- Sales, finance and procurement are the worst offenders, clicking on links in malicious messages 50-80 percent more frequently than the average departmental click rate.
- Clicks happen fast. Organizations no longer have weeks or even days to find and stop malicious emails because attackers are luring two-out-of-three end users into clicking on the first day, and by the end of the first week, 96 percent of all clicks have occurred. In 2013, only 39 percent of emails were clicked in the first 24 hours; however, in 2014 that number increased to 66 percent.
- Attacks are occurring mostly during business hours. The majority of malicious messages are delivered during business hours, peaking on Tuesday and Thursday mornings. Tuesday is the most active day for clicking, with 17 percent more clicks than the other weekdays.
- Users learn, but attackers adapt faster than users can learn. The use of social media invitation lures, which were the most popular and effective email lures in 2013, decreased 94 percent in 2014. Email lures that employ attachments rather than URLs, such as message notifications and corporate financial alerts, increased significantly as a vector. During select days in 2014, Proofpoint saw a 1,000 percent increase in messages with malicious attachments over the normal volume. The most popular email lures in 2014 included e-fax and voicemail notifications, and corporate and personal financial alerts.
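The report's headline figures can be combined into a rough exposure estimate. The sketch below is a hypothetical back-of-the-envelope model, not part of the report: it applies the 1-in-25 click rate and the day-one (66%) and week-one (96%) click-timing shares to an assumed volume of delivered malicious messages.

```python
def expected_clicks(messages_delivered, click_rate=1 / 25,
                    day_one_share=0.66, week_one_share=0.96):
    """Rough model of the report's figures: users click roughly 1 in 25
    malicious messages; 66% of clicks land on day one, 96% within a week."""
    total = messages_delivered * click_rate
    return {
        "total": total,
        "day_one": total * day_one_share,
        "week_one": total * week_one_share,
    }

# Hypothetical example: 500 malicious messages reaching inboxes
estimate = expected_clicks(500)
```

Even at a modest assumed volume, the model makes the report's point concrete: most of the resulting clicks occur within hours, so defenses that take days to react arrive after the damage is done.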
Proofpoint’s report is based on data gathered from its suite of advanced threat protection products that are live within customer environments. To obtain a copy of Proofpoint's Human Factor Report, please visit www.proofpoint.com/humanfactor
As an emergency manager, one of the easiest questions to answer is: Why do we do what we do? Thoughts of preventing loss of life and protecting property for our families, neighbors and all members of our community and nation quickly spring to mind. A frequent follow-on question can be more complex: That sounds important, so how do you make sure you get it done right?
As we answer this next question, we may recall the problems we solved: the time we found a flaw in our response plan that we quickly fixed, or the moments in the Emergency Operations Center when we relied on our team and our training to make the right decisions. Indeed, it is our ability to problem-solve effectively that keeps emergency management so dynamic. Whether we work in preparedness, mitigation, response or recovery, as we identify solutions to address the worst-of-the-worst that could happen (or has happened) to our communities, we act as agents of dynamic change.
This dynamism goes all the way to our core, as even our foundational structure and methodology have evolved significantly since the turn of the century. In recent years we have redefined our relationship with homeland security; we have learned our place under one National Incident Management System; the list could go on. This ongoing evolution, empowered by our willingness to identify our weaknesses and strengthen them, is a core reason why our community is so strong.
(TNS) — Nearly half of all Americans — 150 million people — are threatened by possibly damaging shaking from earthquakes, scientists said Wednesday at a meeting of the Seismological Society of America.
That figure, from all 50 states and Puerto Rico, is a sharp jump from the figure in 1994, when the Federal Emergency Management Agency estimated just 75 million Americans in 39 states were at risk from earthquakes.
The sharp increase in exposure to quake damage was largely because of population growth in areas prone to earthquakes, particularly California, said William Leith, a coauthor of the study, which included the U.S. Geological Survey, and senior science advisor for earthquake and geologic hazards at the USGS.
(TNS) — Using some of its strongest language to date, the Oklahoma Geological Survey said Tuesday the state's ongoing earthquake swarm is "very unlikely to represent a naturally occurring process."
The state survey said the suspected source of triggered earthquakes is the use of wastewater disposal wells that dump large amounts of water produced along with oil production.
"The observed seismicity of greatest concentration, namely in central and north-central Oklahoma, can be observed to follow the oil and gas plays characterized by large amounts of produced water," the report stated. "Seismicity rates are observed to increase after a time-delay as injection volumes increase within these plays. In central and north-central Oklahoma, this time-delay can be weeks to a year or more."