Netwrix Sums Up the Statistics on Data Breaches in 2015 and Identifies Types of Cybercrime That Will Potentially Affect U.S. Companies in the Coming Year
IRVINE, Calif. – According to the Chronology of Data Breaches by Privacy Rights Clearinghouse, a nonprofit organization that aims to raise awareness about how technology affects personal privacy, the number of companies that experienced a data breach in 2015 fell by 44% from 2014. At first glance, it would seem that predictions about 2015 being the "Year of the Super-Mega Breach" did not come true. But a closer look reveals that hackers stopped wasting time on trifles and focused on jackpots instead. Fewer incidents still resulted in over 197 million compromised records, compared to 295 million in 2014.
Furthermore, security experts still cannot assess the actual damage for 68% of data breaches, so the number of compromised records is likely to grow. Cyberattacks have become more targeted and sophisticated than ever, and they now pose an equal threat to all companies that handle any type of sensitive data, whether financial or personal.
Netwrix Corporation, a provider of IT auditing software that delivers complete visibility into IT infrastructure changes and data access, has picked out five patterns of cybercrime that were the most common root cause of security incidents in 2015. The data breaches listed below are ranked by the number of reported cases and point to the security threats that will require the most vigilance in 2016.
1. Hacking or malware. Malware and electronic entry by an outside party remained the leading cause of data breaches for the second year in a row. Overall, 92 registered cases occurred because hackers gained unauthorized entry into a company's systems via Web app attacks, spyware, social engineering and Trojans. These incidents accounted for the lion's share of all compromised customer records (around 195 million).
2. Portable devices. The second most frequently reported type of cybercrime was the unauthorized access to information stored on portable devices, including laptops, smartphones and external hard drives. Overall, 28 security incidents investigated to date resulted in the loss of over 20,000 sensitive data records this year.
3. Unintended disclosure. The human factor is still a serious issue for data security. More than 38,000 records were exposed in 26 incidents due to employees' errors, such as misdirected emails and confidential information accidentally posted on companies' websites.
4. Insider misuse. Company employees or contractors with legitimate access to sensitive information posed a threat to data integrity in 11 cases. Despite the relatively small number of incidents, insiders caused significant damage and compromised more than 600,000 customer records.
5. Physical loss. Lost, discarded or stolen non-electronic assets containing sensitive information (e.g., paper documents), as well as card skimming and theft of stationary devices, were the root cause of data leaks in five cases, resulting in the loss of 1,100 records.
"Although we saw fewer security incidents than expected, the actual damage from data breaches is still substantial. What is even more disturbing, more than half of all breaches are still at the stage of investigation, and we don't know their real scope yet," said Alex Vovk, CEO and cofounder of Netwrix. "The huge number of compromised records shows that we are still lagging behind highly motivated intruders. We need to adopt a new cybersecurity mindset and combine conventional perimeter protection with deep visibility into our networks to gain better control over the IT infrastructure and minimize the risk of data loss."
About Netwrix Corporation
Netwrix Corporation provides IT auditing software that delivers complete visibility into IT infrastructure changes and data access, including who changed what, when and where each change was made, and who has access to what. Over 150,000 IT departments worldwide rely on Netwrix to audit IT infrastructure changes and data access, prepare reports required for passing compliance audits, and increase the efficiency of IT operations. Founded in 2006, Netwrix has earned more than 70 industry awards and was named to both the Inc. 5000 and Deloitte Technology Fast 500 lists of the fastest-growing companies in the U.S. For more information, visit www.netwrix.com.
Competition among insurers and their ongoing fight for market share drove the composite rate for the U.S. property and casualty market down 4% in December. But while market cycles are here to stay, the current cycle is tame compared with some previous years: in 2002 there was an average rate increase of 30%, and in 2007 an average decrease of 13%, according to MarketScout.
“Market cycles are part of our life, be it insurance, real estate, interest rates or the price of oil. Market cycles are going to occur without question. The only questions are when, how much and how long,” MarketScout CEO Richard Kerr said in a statement. “While it may seem the insurance industry has already been in a prolonged soft market cycle, we are only four months in. The market certainly feels like it has been soft for much longer, because rates bumped along at flat or plus 1% to 1½% from July 2014 to September 2015.” He pointed out that the technical trigger of a soft market occurs when the composite rate drops below par for three consecutive months.
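Kerr's "technical trigger" is a simple mechanical rule, and it can be sketched in a few lines of Python. This is only an illustration: the function name, its interface, and the sample data are invented here, not anything published by MarketScout.

```python
def soft_market_start(monthly_rate_changes):
    """Return the index of the month where a soft market technically begins,
    i.e., the first of three consecutive below-par (negative) composite rate
    changes, or None if the trigger never fires.

    monthly_rate_changes: composite rate changes in percent per month,
    e.g. -4.0 means rates were down 4% that month.
    """
    streak = 0
    for i, change in enumerate(monthly_rate_changes):
        if change < 0:          # below par this month
            streak += 1
            if streak == 3:     # three consecutive months below par
                return i - 2    # the soft market dates from the first of them
        else:
            streak = 0          # any flat or rising month resets the count
    return None

# Hypothetical series: flat, up 1%, then three straight monthly declines.
print(soft_market_start([0.0, 1.0, -1.0, -2.0, -4.0]))  # → 2
```

By this rule, a run of flat or slightly rising months (like the plus 1% to 1½% stretch Kerr describes) keeps resetting the streak, which is why the market can "feel" soft long before the technical trigger fires.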
If you come across the name Booz Allen Hamilton, it’s usually in connection with defense-agency IT services contracts worth tens of millions of dollars. The tech consulting and engineering giant, more than 100 years old, is primarily in the business of solving big technology problems for government agencies, although it does also work in the private sector.
What you don’t see is Booz Allen mentioned in the context of open source technology. But that may soon change, as the company’s recently formed group charged with driving its participation in the open source community picks up speed. Most of this group’s work is focused on data centers and cloud, said Jarid Cottrell, a Booz Allen senior associate who leads its cloud computing and open source practice.
The reason Booz Allen now has an open source practice is the same reason companies like GE, John Deere, Walmart, and Target dedicate resources to open source. Like the manufacturing and retail giants, Booz Allen’s customers in government and in the private sector want to build and run software the same way internet giants like Google, Facebook, or Amazon do, and they want the kind of data center infrastructure – often referred to as hyper-scale infrastructure – those internet giants have devised to deliver their services. Market research firm Gartner calls this way of doing things “Mode 2.”
Mel Gosling explains why he believes that business continuity needs a new way forward, and why the traditional business continuity plan no longer works for today’s organizations.
There is a growing body of business continuity practitioners who believe that a new approach to the discipline is both required and overdue. One example is the recent debate opened up by the publication of ‘The Continuity 2.0 Manifesto’ by David Lindstedt and Mark Armour.
I recently added to that debate with a presentation entitled ‘The BC Plan is Dead!’ at the Business Continuity Institute’s BCI World conference in November 2015. In researching examples of companies that have stopped using traditional document-based business continuity plans, I identified a set of key practices that I believe will drive the new approach. One of those companies, Marks and Spencer, gave an excellent practical demonstration at the end of my presentation of what they have managed to achieve with a new approach, showing the audience that this is already happening and is not just a nice theory.
The Business Continuity Institute’s North America business continuity and resilience awards will take place on March 15, 2016, at DRJ Spring World 2016 in Orlando, Fla.
Entries are now open and this year’s categories include:
- Continuity and Resilience Consultant 2016
- Continuity and Resilience Professional (Private Sector) 2016
- Continuity and Resilience Professional (Public Sector) 2016
- Most Effective Recovery 2016
- Continuity and Resilience Newcomer 2016
- Continuity and Resilience Team 2016
- Continuity and Resilience Provider (Service/Product) 2016
- Continuity and Resilience Innovation 2016
- Industry Personality 2016
The deadline for entries is February 14, 2016.
To enter, click here.
Barrels of apples can go bad, both literally and figuratively, because of just one rotten apple. The rot spreads from one apple to another until the whole barrel is infected. Not so long ago, in 2014, experts from security company ESET discovered 25,000 servers infected with malware, many of them grouped together in networks and infected together. The common factor was the installation of the Linux/Ebury malware, which allowed login information to be harvested and communicated to the attackers who installed it. According to the experts, attackers needed to compromise just one server to gain easy access to others in the same network. But was this one bad apple – or the whole lot?
Emergency response, information technology, and healthcare communications are three scenarios in which notification systems play a critical role. Recent disasters have demonstrated the benefits of crowdsourcing during response efforts, so notification systems are leveraging this responsiveness through two-way communication technology that can both disseminate and receive information.
The critical communications world continues to evolve, resulting in users taking a closer look at their existing notification systems to determine whether they remain effective tools for communicating crucial information. However, before these systems can be assessed, it is important to first understand a few of the ways these tools are being utilized, the challenges faced within each use case, and how, as we look forward to 2016, these hurdles can be overcome.
Toyota, the world’s largest automaker, is planning to build a data center specifically to collect and analyze data from cars equipped with a new type of Data Communication Module (DCM). The upcoming feature will transmit data over cellular networks and enable the company’s next-generation connected-vehicle framework.
“To build the IT infrastructure needed to support this significant expansion of vehicle data processing, the company will create a Toyota Big Data Center (TBDC) in the Toyota Smart Center,” the company said in a statement. “TBDC will analyze and process data collected by DCM, and use it to deploy services under high-level information security and privacy controls.”
Connected cars are among the fast-growing new sources of data expected to drive demand for data transmission, storage, and processing capacity; such connected devices are collectively referred to as the Internet of Things.
Shlomo Kramer is the Co-Founder and CEO of Cato Networks.
The cloud revolution is reshaping the technology sector, as the business results of companies like HP and IBM make clear. Legacy technology providers are certainly embracing the cloud, transforming their businesses from building and running on-premises infrastructure to delivering cloud-based services. The harsh reality is that this is a destructive transformation: for every dollar that exits legacy environments, only a fraction comes back through cloud services. That shrinkage is precisely the great promise of the cloud for customers: maximizing economies of scale, efficient resource utilization and smart sharing of scarce capabilities.
It is just the latest phase of the destructive force that technology applies to all parts of our economy. Traditionally, technology vendors touted benefits such as personnel efficiencies and operational savings as part of the justification for purchasing new technologies, a politically correct way to refer to fewer people, offices and the support systems around them. This has now inevitably come back around to the technology vendors themselves. Early indicators were abundant: Salesforce.com displaced Siebel Systems, reducing the need for costly and customized implementations, and Amazon AWS is increasingly displacing physical servers, reducing the need for processors, cabinets, cabling, power and cooling.
If businesses have been automating factories since the Carter administration, why is manufacturing the last acceptable data silo in so many companies? And when will that change?
Until recently, absorbing the factory floor into the enterprise has been too expensive and complex for all but the biggest companies.
Assuming the hardware (primarily sensors) and software needed to gather, disseminate and analyze manufacturing data continue to evolve at the current pace -- a safe assumption -- the mainstreaming of manufacturing integration should occur in less than a decade.