Some MSPs may be understandably worried about taking on the responsibility of backing up large data sets. After all, just about any analyst you talk to is projecting data growth of 30% or more per year. So is it a wise move to add to existing service responsibilities by taking on an additional service such as cloud backup? The answer is a resounding yes.
Here are the facts. First, a major potential headache when it comes to cloud backup is the initial backup of a large data set from a new customer. That’s the one that could take a while. But it doesn’t take that long when you use an enterprise-class cloud backup solution.
A recent independent test by Mediatronics revealed that Zetta.net could back up half a terabyte of data in less than three hours over a 1Gbit connection. After that, an incremental backup, using a 5% change rate as a worst-case scenario, took only an hour. In reality, 5% is an aggressive change rate: surveys show that the rate of change in a typical organization is only about 2% of the entire data set. This opens the door to hosting a larger total data set in the cloud.
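The transfer times above are easy to sanity-check with simple arithmetic. The sketch below uses the figures from the Mediatronics test (0.5 TB initial set, 1 Gbit connection, 5% worst-case change rate); the 70% effective link efficiency is an illustrative assumption, not a measured value.

```python
def transfer_hours(data_gb: float, link_gbps: float, efficiency: float = 0.7) -> float:
    """Hours to move data_gb gigabytes over a link_gbps link at the
    given effective efficiency (assumed, to account for protocol overhead)."""
    gigabits = data_gb * 8                         # GB -> Gbit
    seconds = gigabits / (link_gbps * efficiency)  # ideal time, derated
    return seconds / 3600

initial = transfer_hours(500, 1.0)             # full 0.5 TB seed backup
incremental = transfer_hours(500 * 0.05, 1.0)  # 5% worst-case daily change

print(f"Initial backup:    ~{initial:.1f} h")
print(f"Daily incremental: ~{incremental:.2f} h")
```

Both results land comfortably inside the test's reported windows of "less than three hours" for the seed and "only an hour" for the incremental.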
Many companies still lack the means or motivation to protect themselves from malicious insiders; but the effects of insider threats are simply too big to ignore. According to a report by the market research company Forrester, 46 percent of nearly 200 technology decision-makers reported internal incidents as the most common cause of the breaches they experienced in the past year. Out of those respondents, almost half said the breach stemmed from a malicious insider.
In this article TK Keanini looks at the practical steps that organizations can take to protect data and systems from insider threats.
The Ponemon Institute has released its annual Cost of Data Breach Study: Global Analysis, sponsored by IBM. According to the benchmark study of 350 companies spanning 11 countries, the average consolidated total cost of a data breach is $3.8 million, representing a 23 percent increase since 2013.
The study also found that the average cost incurred for each lost or stolen record containing sensitive and confidential information increased six percent from a consolidated average of $145 to $154. Healthcare emerged as the industry with the highest cost per stolen record with the average cost for organizations reaching as high as $363. Additionally, retailers have seen their average cost per stolen record jump dramatically from $105 last year to $165 in this year's study.
"Based on our field research, we identified three major reasons why the cost keeps climbing," said Dr. Larry Ponemon, chairman and founder, Ponemon Institute. "First, cyber attacks are increasing both in frequency and the cost it requires to resolve these security incidents. Second, the financial consequences of losing customers in the aftermath of a breach are having a greater impact on the cost. Third, more companies are incurring higher costs in their forensic and investigative activities, assessments and crisis team management."
The first Cost of Data Breach study was conducted 10 years ago in the United States. Since then, the research has expanded to 11 countries. Ponemon Institute's Cost of Data Breach research is based on actual data of hundreds of indirect and direct cost categories collected at the company level using field-based research methods and an activity-based costing framework. This approach has been validated from the analysis of more than 1,600 companies that experienced a material data breach over the past 10 years in 11 countries.
I continue my exploration of actions you can take to improve your compliance program during an economic downturn with a review of what my colleague Jan Farley, the Chief Compliance Officer (CCO) at Dresser-Rand, called the ‘Desktop Risk Assessment’. Both the Department of Justice (DOJ) and the Securities and Exchange Commission (SEC) make clear the need for a risk assessment to inform your compliance program. I believe that most, if not all, CCOs and compliance practitioners understand this well-articulated need. The FCPA Guidance could not have been clearer when it stated, “Assessment of risk is fundamental to developing a strong compliance program, and is another factor DOJ and SEC evaluate when assessing a company’s compliance program.” While many compliance practitioners have difficulty getting their arms around what is required for a risk assessment, and then how precisely to use it, the FCPA Guidance makes clear there is no ‘one size fits all’ for virtually anything in an effective compliance program.
One type of risk assessment can consist of a full-blown, worldwide exercise, in which teams of lawyers and fiscal consultants travel the globe, interviewing and auditing. Of course, this can be a notoriously expensive exercise, and if you are in Houston, the energy industry or any other sector in the economic doldrums right now, it may not be something you can secure funding for at this time. Moreover, you may also be constrained by reduced compliance personnel, so that you cannot even perform a full-blown risk assessment with internal resources.
By conventional standards, business continuity cannot exceed one hundred percent. Business continuity of less than 100% is obviously possible, although measurements of just how much less may only be approximate. However, if everything is working properly, full business continuity has been achieved. Does it then make sense to talk about ‘fuller than full’, or a business continuity index of more than 100%?
Most of the commentary regarding the cloud these days (mine included) focuses on the myriad ways in which abstract, distributed architectures can remake the enterprise as we know it.
We talk of software-defined data environments, hyperscale infrastructure and advanced Big Data and mobile application environments that will allow organizations to shed their rusty legacy environments in favor of a brave new world of computing.
The trouble is, most organizations don’t want that – at least, not right away.
The simple fact of the matter is that radical change is frightening to most people, and the typical CIO or data management executive is driven not by a desire to deploy the latest and greatest technology but to implement solutions that contribute to the bottom line.
(TNS) — The recent rioting and unrest in Baltimore will cost the city an estimated $20 million, officials said Tuesday.
The expenses — which go before the city’s spending board for approval Wednesday — include overtime for police and firefighters, damage to city-owned property and repaying other jurisdictions for police and other assistance.
Henry J. Raymond, Baltimore’s finance director, said the city can temporarily cover the costs from its rainy-day fund while seeking reimbursement for up to 75 percent from the Federal Emergency Management Agency.
“The city remains on strong financial footing,” Raymond said. “Hopefully, with the FEMA reimbursement, it will reduce the financial stress that we’re under. In terms of the city’s overall revenue structure, we’re on firm footing and we’ll move forward.”
According to a new study by the Ponemon Institute, sponsored by IBM, the average consolidated total cost of a data breach is $3.8 million, representing a 23% increase since 2013. The annual 'Cost of Data Breach Study' also found that the average cost incurred for each lost or stolen record containing sensitive and confidential information increased 6% from a consolidated average of $145 to $154.
"Based on our field research, we identified three major reasons why the cost keeps climbing," said Dr Larry Ponemon, chairman and founder, Ponemon Institute. "First, cyber attacks are increasing both in frequency and the cost it requires to resolve these security incidents. Second, the financial consequences of losing customers in the aftermath of a breach are having a greater impact on the cost. Third, more companies are incurring higher costs in their forensic and investigative activities, assessments and crisis team management."
Data breaches are a significant threat to organizations, as highlighted in the Business Continuity Institute's latest Horizon Scan report, which revealed that 82% of survey respondents were either concerned or extremely concerned about the threat of a cyber attack materialising, while 74% expressed the same level of concern about a data breach, making them the first and third greatest threats respectively.
Some of the highlights from the Ponemon Institute’s research include:
- Board level involvement and the purchase of insurance can reduce the cost of a data breach. The study looked at the positive consequences that can result when boards of directors take a more active role in the aftermath of a data breach. Board involvement reduces the cost by $5.50 per record; insurance protection reduces it by $4.40 per record.
- Business continuity management plays an important role in reducing the cost of data breach. The research reveals that having business continuity management involved in the remediation of the breach can reduce the cost by an average of $7.10 per compromised record.
- The most costly breaches continue to occur in the US and Germany at $217 and $211 per compromised record respectively. India and Brazil still have the least expensive breaches at $56 and $78 respectively.
- The cost of data breach varies by industry. The average global cost of data breach per lost or stolen record is $154. However, if a healthcare organization has a breach, the average cost could be as high as $363, and in education the average cost could be as high as $300. The lowest cost per lost or stolen record is in transportation ($121) and public sector ($68).
- Hackers and criminal insiders cause the most data breaches. 47% of all breaches in this year's study were caused by malicious or criminal attacks. The average cost per record to resolve such an attack is $170. In contrast, system glitches cost $142 per record and human error or negligence is $137 per record. The US and Germany spend the most to resolve a malicious or criminal attack ($230 and $224 per record, respectively).
- Notification costs remain low, but costs associated with lost business steadily increase. Lost business costs include abnormal turnover of customers, increased customer acquisition activities, reputation losses and diminished goodwill. The average cost has increased from $1.23 million in 2013 to $1.57 million in 2015. Notification costs decreased from $190,000 to $170,000 since last year.
- Time to identify and contain a data breach affects the cost. The study shows the relationship between how quickly an organization can identify and contain data breach incidents and financial consequences. Malicious attacks can take an average of 256 days to identify while data breaches caused by human error take an average of 158 days to identify. As discussed earlier, malicious or criminal attacks are the most costly data breaches.
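The per-record figures above can be combined into a rough cost model. This is illustrative only: the study reports averages, not an additive formula, and the 25,000-record breach size is a hypothetical example.

```python
# Averages reported in the study: $154 global cost per lost record, with
# per-record reductions of $7.10 (business continuity management involved),
# $5.50 (active board) and $4.40 (insurance protection).
AVG_COST_PER_RECORD = 154.00
REDUCTIONS = {"bcm": 7.10, "board": 5.50, "insurance": 4.40}

def estimated_breach_cost(records: int, factors=()) -> float:
    """Estimated total cost for a breach of `records` records, applying
    the per-record reductions named in `factors`."""
    per_record = AVG_COST_PER_RECORD - sum(REDUCTIONS[f] for f in factors)
    return records * per_record

# A hypothetical 25,000-record breach, with and without all three factors:
baseline = estimated_breach_cost(25_000)
mitigated = estimated_breach_cost(25_000, ("bcm", "board", "insurance"))
print(f"Baseline:  ${baseline:,.0f}")
print(f"Mitigated: ${mitigated:,.0f}")
```

At 25,000 records the baseline works out to $3.85 million, close to the study's $3.8 million average total, while the three mitigating factors together shave roughly $425,000 off the estimate.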
Hanover Attains Continuous Uptime with DataCore SANsymphony-V Deployed in a Synchronous Mirror Configuration – Ensuring Data Redundancy
HANOVER, Pa. – DataCore, a leader in software-defined storage, today announced that Hanover Hospital has realized continuous uptime with its high-availability software-defined storage and has significantly reduced the time and effort it takes to provision storage and systems.
“The biggest benefit Hanover Hospital has experienced from adopting DataCore has been true high availability due to the automatically synchronized virtual disks that are mirror protected and presented to different applications spanning our two on-campus datacenters,” stated Douglas Null, senior technical architect-MIS department, Hanover Hospital. “Each data center shares critical workloads – yet provides physical separation of storage and compute in the event of a localized data center outage. DataCore SANsymphony-V is our only storage solution and it delivers ‘no touch’ failover and failback operation. It delivers a fully automated process. Other vendor solutions are replicated as active/passive, need human intervention or scripts, or require other point products or special configurations to bring the passive site online.”
Within healthcare, IT is under enormous pressure to increase storage capacity, improve resiliency and accelerate performance – all while managing costs. (See DataCore’s infographic: Healthcare IT Storage Challenges) Hanover Hospital is one of more than 1,000 healthcare customers that have trusted DataCore to virtualize its storage infrastructure – thereby making its storage software-defined.
Overcoming Downtime, Data Growth and Slow Performance
Hanover reports that with DataCore SANsymphony-V deployed in a synchronous mirror configuration, it has realized continuous uptime with its high-availability storage and has significantly reduced the time it takes to provision storage and systems. According to Null, “DataCore keeps both our users and patients happier because of high systems availability. Moreover, with DataCore we have been able to simplify management and reduce the total cost of ownership of the entire storage infrastructure.”
Hanover originally started a number of years back with a single DataCore installation in one data center. Over the years, the hospital added synchronous mirroring that stretched storage availability between its two on-campus data centers. DataCore SANsymphony-V now serves as a unified storage services platform across the entire multi-site infrastructure. In particular, it is relied upon extensively for various mission-critical, enterprise and clinical applications. Examples of these include the hospital’s Healthcare BI reporting platforms, clinical middleware, medical dictation and transcription services, and Citrix XenApp, among others.
Null adds, “We get very impressive performance and bandwidth throughput for the amount of VM servers and applications we are hosting on our environment. Plus, we have improved storage utilization since we are able to over-provision storage by about sixty percent, meaning we are more efficient in our ability to meet the growth and cost demands for more capacity.”
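The over-provisioning figure Null cites is straightforward thin-provisioning arithmetic. In the sketch below, only the 60% ratio comes from the article; the 10 TB physical pool is a hypothetical value for illustration.

```python
def provisioned_capacity(physical_tb: float, overprovision_pct: float) -> float:
    """Logical capacity that can be presented to hosts when thin
    provisioning over-commits physical storage by the given percentage."""
    return physical_tb * (1 + overprovision_pct / 100)

# A hypothetical 10 TB physical pool over-provisioned by 60% can present
# 16 TB of logical capacity to applications.
print(provisioned_capacity(10, 60))
```

The efficiency gain comes from applications rarely consuming all the capacity allocated to them, so the shared physical pool can safely back more logical capacity than it physically holds.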
Furthermore, the IT team wanted to deploy a Voice over IP (VOIP) telephony application and wanted the same “always up, always on” capability. “After doing some research, what we came up with was to deploy DataCore – but in this instance use the product in another way altogether. In this case, Hanover deployed DataCore Virtual SAN, which used virtualized storage controllers inside of a VMware ESX host,” stated Null. “That solution had far fewer requirements than Virtual SAN from VMware.”
Hanover Hospital is an independent, not-for-profit community hospital and part of Hanover HealthCare PLUS network of services. The hospital is located in Hanover, Pennsylvania. The hospital has approximately 1,400 staff and 93 beds across 15 buildings. Hanover manages 6,000 patient visits, 30,000 ER visits, 190,000 outpatient visits, 600,000 lab tests, 90,000 imaging scans, and over 600 births.
Hanover Hospital – Addressing the Top 3 Storage Challenges in Healthcare
To learn more, please view our recorded webinar featuring Hanover Hospital, “Addressing the Top Three Storage Challenges in Healthcare.” It highlights the challenges faced by healthcare IT departments such as maintaining 24x7x365 operations, managing explosive data growth and ensuring the highest performance from critical applications.
In the webinar, Hanover Hospital’s Douglas Null, senior technical architect-MIS department, discusses his firsthand experience and best practices using DataCore’s software-defined storage.
A full case study on the deployment at Hanover Hospital is also available.
About DataCore Software
DataCore is a leader in software-defined storage. The company’s storage virtualization and virtual SAN solutions empower organizations to seamlessly manage and scale their data storage architectures, delivering massive performance gains at a fraction of the cost of solutions offered by legacy storage hardware vendors. Backed by 10,000 customer sites around the world, DataCore’s adaptive, self-learning and self-healing technology takes the pain out of manual processes and helps deliver on the promise of the new software-defined data center through its hardware-agnostic architecture.
Visit http://www.datacore.com or call (877) 780-5111 for more information.
Historic City Transforms Interdepartmental Business Processes – Reduces Latency, Simplifies File Sharing, Increases Flexibility and Reduces Overall Storage Costs
NORCROSS, Ga. – StorTrends® today announced that its high performance storage area network (SAN) storage appliances have been credited with "revolutionizing" the IT infrastructure of the City of Napoleon. Located 35 miles from Toledo, Ohio, and home to almost 9,000 residents, the City of Napoleon implemented the StorTrends SAN solution to help modernize its IT infrastructure while keeping costs down.
"Adding SAN to the City of Napoleon's IT environment has completely transformed how business is done interdepartmentally," said Dan Wachtman, MIS Administrator, City of Napoleon. "I would often lose sleep at night worrying about the data within the City of Napoleon's IT environment. But now, I sleep comfortably knowing our data is fully protected thanks to the StorTrends SANs with their enterprise class snapshots and replication for disaster recovery."
The IT department for the City of Napoleon is responsible for supporting the 19 departments that make up the city's government infrastructure. Its combined Windows and Linux IT environment includes a myriad of physical servers from various vendors, as well as 25 HP and Citrix virtual servers, and Microsoft SQL Server databases.
Prior to deploying the StorTrends solution, the City of Napoleon outsourced most of its larger IT jobs. But as costs and demands continued to escalate, Wachtman, who has been involved in the city's IT division for 15 years, recognized the need for a more efficient and cost-effective solution.
Some of the benefits that the city has experienced after making the transition to SAN include reduced latency, simplified file sharing, increased flexibility, easier storage deployment, and reduced overall storage costs. The City of Napoleon found that specific features of the StorTrends solution, such as snapshots and disaster recovery (DR), were particularly helpful.
Wachtman added that the StorTrends support plan, StorAID, helped make the jump to SAN less challenging. "The technical support that we experienced was second to none; when we needed them, they were there," said Wachtman. "After experiencing this high level of service, I would tell anyone that there is no reason to buy any other competitive storage solution."
"For organizations and institutions, like the City of Napoleon, StorTrends offers a multitude of configurations in all-flash, hybrid flash and spinning disk arrays to meet the requirements and budgets of all IT environments," said Justin Bagby, Director of the StorTrends Division at American Megatrends, Inc. (AMI). "IT professionals that are interested in a price quote on the StorTrends SAN Arrays should check out our online Price Quote Generator for a hassle-free price quote."
To read more about the City of Napoleon, and other StorTrends customers, please visit: www.stortrends.com/resources/customer-stories.
StorTrends® from American Megatrends (AMI) is Performance Storage with Proven Value. StorTrends SAN and NAS storage appliances are installed worldwide and trusted by companies and institutions in a wide range of industries including education, energy, finance, state and local government, healthcare, manufacturing, marketing, retail, R&D and many more. StorTrends meets the challenges and demands of today's business environments by offering a wide variety of solutions, from all-flash and hybrid storage to spinning disk. StorTrends is backed by 1,100+ customer installations, 100+ storage patents and nearly 30 years of IT leadership from a company that millions of people trust on a daily basis, American Megatrends, Inc. For further information, please visit: http://www.stortrends.com.