Some MSPs may be understandably worried about taking on the responsibility of backing up large data sets. After all, just about any analyst you talk to is projecting data growth of 30% or more per year. So is it a wise move to add to existing service responsibilities by taking on an additional service such as cloud backup? The answer is a resounding yes.
Here are the facts. First, a major potential headache when it comes to cloud backup is the initial backup of a large data set from a new customer. That’s the one that could take a while. But it doesn’t take that long when you use an enterprise-class cloud backup solution.
A recent independent test by Mediatronics found that Zetta.net could back up half a terabyte of data in less than three hours over a 1 Gbit connection. After that, an incremental backup, using a 5% change rate as a worst-case scenario, took only an hour. In practice, 5% is an aggressive change rate: surveys show that the rate of change in any organization is typically only about 2% of the entire data set. This opens the door to hosting a larger total data set in the cloud.
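As a rough sanity check on those figures, the raw wire time for each transfer can be estimated from the data size and link speed. This is only a back-of-envelope sketch assuming an idealized link; the observed times in the test will be longer because of protocol overhead, scanning and deduplication work on the backup software's side.

```python
# Back-of-envelope estimate of cloud backup transfer times,
# assuming an idealized link with no protocol or processing overhead.

def transfer_hours(data_gb: float, link_gbit_per_s: float) -> float:
    """Hours needed to move data_gb gigabytes over a link_gbit_per_s link."""
    data_gbit = data_gb * 8            # gigabytes -> gigabits
    seconds = data_gbit / link_gbit_per_s
    return seconds / 3600

# Initial seed: half a TB (~500 GB) over a 1 Gbit/s connection.
full_backup = transfer_hours(500, 1.0)

# Incremental: 5% worst-case change rate on the same data set.
incremental = transfer_hours(500 * 0.05, 1.0)

print(f"Full backup wire time:  {full_backup:.2f} h")
print(f"Incremental wire time:  {incremental:.2f} h")
```

The ideal wire time for the full seed comes out near an hour, so the reported three-hour result is plausible once real-world overhead is factored in; the key takeaway stands either way, since incrementals move only a small fraction of the full set.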
Many companies still lack the means or motivation to protect themselves from malicious insiders, but the effects of insider threats are simply too big to ignore. According to a report by the market research company Forrester, 46 percent of nearly 200 technology decision-makers reported internal incidents as the most common cause of the breaches they experienced in the past year. Of those respondents, almost half said the breach stemmed from a malicious insider.
In this article TK Keanini looks at the practical steps that organizations can take to protect data and systems from insider threats.
The Ponemon Institute has released its annual Cost of Data Breach Study: Global Analysis, sponsored by IBM. According to the benchmark study of 350 companies spanning 11 countries, the average consolidated total cost of a data breach is $3.8 million, representing a 23 percent increase since 2013.
The study also found that the average cost incurred for each lost or stolen record containing sensitive and confidential information increased six percent from a consolidated average of $145 to $154. Healthcare emerged as the industry with the highest cost per stolen record with the average cost for organizations reaching as high as $363. Additionally, retailers have seen their average cost per stolen record jump dramatically from $105 last year to $165 in this year's study.
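The percentage changes quoted above are simple to verify from the per-record figures themselves. The short sketch below is just an illustrative check of the arithmetic, not part of the study's methodology.

```python
# Check the percent changes implied by the per-record cost figures
# quoted from the Ponemon study.

def pct_change(old: float, new: float) -> float:
    """Percentage change from old to new."""
    return (new - old) / old * 100

# Consolidated average per-record cost: $145 -> $154 (reported as ~6%).
print(f"Per-record cost change: {pct_change(145, 154):.1f}%")

# Retail per-record cost: $105 -> $165.
print(f"Retail per-record change: {pct_change(105, 165):.1f}%")
```

The consolidated figure works out to just over 6 percent, matching the study's rounding, while the retail jump is well over 50 percent, which is why the article calls it dramatic.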
"Based on our field research, we identified three major reasons why the cost keeps climbing," said Dr. Larry Ponemon, chairman and founder of the Ponemon Institute. "First, cyber attacks are increasing both in frequency and in the cost required to resolve these security incidents. Second, the financial consequences of losing customers in the aftermath of a breach are having a greater impact on the cost. Third, more companies are incurring higher costs in their forensic and investigative activities, assessments and crisis team management."
The first Cost of Data Breach study was conducted 10 years ago in the United States. Since then, the research has expanded to 11 countries. Ponemon Institute's Cost of Data Breach research is based on actual data of hundreds of indirect and direct cost categories collected at the company level using field-based research methods and an activity-based costing framework. This approach has been validated from the analysis of more than 1,600 companies that experienced a material data breach over the past 10 years in 11 countries.
I continue my exploration of actions you can take to improve your compliance program during an economic downturn with a review of what my colleague Jan Farley, the Chief Compliance Officer (CCO) at Dresser-Rand, called the ‘Desktop Risk Assessment’. Both the Department of Justice (DOJ) and the Securities and Exchange Commission (SEC) make clear the need for a risk assessment to inform your compliance program, and I believe that most, if not all, CCOs and compliance practitioners understand this well-articulated need. The FCPA Guidance could not have been clearer when it stated, “Assessment of risk is fundamental to developing a strong compliance program, and is another factor DOJ and SEC evaluate when assessing a company’s compliance program.” While many compliance practitioners have difficulty getting their arms around what is required for a risk assessment, and then how precisely to use it, the FCPA Guidance makes clear there is no ‘one size fits all’ for just about anything in an effective compliance program.
One type of risk assessment can consist of a full-blown, worldwide exercise, where teams of lawyers and fiscal consultants travel around the globe, interviewing and auditing. Of course, this can be a notoriously expensive exercise, and if you are in Houston, the energy industry or any other sector in the economic doldrums right now, it may not be something you can even seek funding for at this time. Moreover, you may also be constrained by reduced compliance personnel, so that you cannot even perform a full-blown risk assessment with internal resources.
By conventional standards, business continuity cannot exceed one hundred percent. Business continuity of less than 100% is obviously possible, although measurements of just how much less may only be approximate. However, if everything is working properly, full business continuity has been achieved. Does it make sense to then talk about ‘fuller than full’ or a business continuity index that is more than 100%?
Most of the commentary regarding the cloud these days (mine included) focuses on the myriad ways in which abstract, distributed architectures can remake the enterprise as we know it.
We talk of software-defined data environments, hyperscale infrastructure and advanced Big Data and mobile application environments that will allow organizations to shed their rusty legacy environments in favor of a brave new world of computing.
The trouble is, most organizations don’t want that – at least, not right away.
The simple fact of the matter is that radical change frightens most people, and the typical CIO or data management executive is driven not by a desire to deploy the latest and greatest technology but by the need to implement solutions that contribute to the bottom line.
(TNS) — The recent rioting and unrest in Baltimore will cost the city an estimated $20 million, officials said Tuesday.
The expenses — which go before the city’s spending board for approval Wednesday — include overtime for police and firefighters, damage to city-owned property and repaying other jurisdictions for police and other assistance.
Henry J. Raymond, Baltimore’s finance director, said the city can temporarily cover the costs from its rainy-day fund while seeking reimbursement for up to 75 percent from the Federal Emergency Management Agency.
“The city remains on strong financial footing,” Raymond said. “Hopefully, with the FEMA reimbursement, it will reduce the financial stress that we’re under. In terms of the city’s overall revenue structure, we’re on firm footing and we’ll move forward.”