According to the survey conducted by the Ponemon Institute, some 71 percent of employees report that they have access to data they should not see, and more than half say this access is frequent or very frequent.
In the words of Dr. Larry Ponemon, chairman and founder of The Ponemon Institute:
One of the side effects of the consumerization of IT is that some end customers are feeling more empowered than ever to take IT matters into their own hands rather than seek the help of IT solution providers. This is especially true when it comes to cloud services, where business owners (or their employees) can self-install a cloud backup product and instantly have access to 5 GB or more of free cloud storage. Even if business owners aren't actively involved in using or promoting DIY (do-it-yourself) cloud services, research shows their employees are. A study from Skyhigh Networks, which monitors the use of cloud services for businesses, found that the average enterprise uses 545 cloud services, which is approximately 500 more than the average CIO is aware of!
Besides costing businesses control of their corporate data, DIY cloud services play into the hands of cybercriminals who exploit business owners through ransomware. Like other malware, ransomware infects corporate networks through unpatched computers or when a user clicks an infected email attachment. Once launched, the ransomware encrypts common user files on the network, such as documents, spreadsheets, and database files, and demands that the victim pay a ransom to decrypt them.
IT security may be an MSP’s core offering or one of several lines of business. But regardless of its business model, a service provider should take stock of the current threat landscape. MSPs need to know what’s out there if they hope to help clients mitigate their security risks.
What are your customers up against? In 2014, they endured the perfect malware storm. Consider the following:
I recently interviewed a technology start-up that claimed it was already profitable, despite having only a few clients and being only a few months in business. I have no way to verify or refute that, but I can tell you this: the entire product is built around open data.
In fact, its founders adamantly refused to let me call it a technology company, which is just one of many reasons I’m not revealing its name.
“Our product is the data,” one VP repeatedly told me.
That’s a bold claim for a company built on government-released data and other open data sets. If the product really is just the data, and everyone has access to that data, then what’s the point?
(TNS) — Think the Napa fault stopped moving after producing a 6.0 earthquake in August? Think again.
The fault that caused that Napa quake is forecast to move an additional 2 to 6 inches in the next three years in a hard-hit residential area, a top federal scientist said at a meeting of the American Geophysical Union in San Francisco on Tuesday.
It is the first time scientists have formally forecast the gradual shifting of the ground in a residential area after an earthquake.
“Until the South Napa earthquake happened, we had not clearly foreseen just what a problem that could be,” U.S. Geological Survey geophysicist Ken Hudnut said.
It is fascinating to watch a new class of software be born. This doesn’t seem to happen that often anymore, but every once in a while a customer or a vendor discovers a gap in the current offerings and fills that gap with something we have never seen before. I recently ran into an event like this at BMC Engage. BMC has a write-up that subtly points to the impending creation of this new security automation product class. And last week, I spoke to Tony Stevens, who works for the Department of Technology, Management and Budget at the State of Michigan and is helping shepherd the birth of this class. Let’s talk about that this week.
Have you ever thought about all the information your appliances tell you? The world is moving toward presenting instant data about every aspect of life. For example, there is now an electric toothbrush with Bluetooth capabilities that can record your brush strokes and let you chart your dental hygiene activities on a smartphone app. Home sensor products not only tell you if your teenager is trying to sneak out at night, but also how many times someone has been dipping into the cookie jar. And many of us can’t even exercise anymore without a fitness band and apps that record every step, every calorie expended, and every turn in our sleep.
While some of that real-time data is great to have, we’re also reaching a point of TMI … “too much information,” or data overload. How much is too much real-time data? Only you can answer that for your personal data needs, but I do know there is one area where there is never enough real-time data. That is in your company’s disaster recovery plan.
Think about a disaster striking your business. You could have all your subject matter experts in place, but if they can’t access data, or if your recovery strategy is incomplete, nothing will work. The consequences can be catastrophic: most companies forced to shut down by server problems or another disaster never manage to recover in a timely fashion. And let’s face it … a faltering or incomplete recovery can spell death for a business.
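One practical way to keep real-time data flowing into a recovery plan is to check backup freshness automatically rather than waiting for a disaster to reveal a gap. Here is a minimal sketch of such a check; the paths, the 24-hour threshold, and the function names are hypothetical examples, not taken from any particular backup product:

```python
# Minimal sketch: flag backups that have aged past the recovery point
# objective (RPO). All paths and thresholds below are hypothetical.
import os
import time

def backup_age_hours(path: str) -> float:
    """Hours since the backup file was last modified."""
    return (time.time() - os.path.getmtime(path)) / 3600

def check_rpo(paths, rpo_hours: float):
    """Return the backups whose age exceeds the allowed RPO window."""
    return [p for p in paths if backup_age_hours(p) > rpo_hours]

if __name__ == "__main__":
    # Hypothetical backup locations; skip any that don't exist.
    backup_paths = ["/backups/crm.tar.gz", "/backups/erp.tar.gz"]
    existing = [p for p in backup_paths if os.path.exists(p)]
    for p in check_rpo(existing, rpo_hours=24):
        print(f"ALERT: {p} is older than the 24-hour recovery point objective")
```

Run on a schedule, a check like this turns a recovery plan from a static document into a live signal, which is exactly the kind of real-time data worth having.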
To customers, the cloud often seems like an ideally flexible application and data storage solution. On the other hand, starting as a cloud provider often requires very deep pockets. As a result, not every provider stays the course. And if under-capitalisation doesn’t kill a provider off, there is always the danger of a marketing failure that persuades backers to pull the plug. The irony of the situation is that many customers want to make their cloud provider a strategic part of their disaster planning. However, customers must then also extend their plan to include the possibility that the provider itself is the disaster.
Data is the lifeblood of the modern enterprise, and as with most complex organisms, loss of blood can lead to weakness and death.
So it is no wonder that data recovery has emerged as a top priority as the enterprise finds itself entrusting the care and maintenance of its lifeblood to third-party providers to an ever greater degree.
According to Veeam Software, application and data downtime costs the average enterprise about $2 million per year, with the vast majority of that cost attributed to the failure to recover data in a reasonable amount of time. This presents IT with a double-edged sword: the pressure to improve recovery times is often accompanied by the front office’s reluctance to invest in adequate backup and recovery (B&R) infrastructure. The same squeeze contributes to permanent data loss, as many organizations maintain backup windows and restore points that fail to account for how much potentially critical data can accumulate in a relatively short time.
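The arithmetic behind those restore-point and downtime figures is simple to sketch. The numbers below (change rates, incident counts, hourly costs) are illustrative assumptions, not figures from the Veeam study:

```python
# Rough sketch of restore-point exposure and downtime cost.
# All input figures are hypothetical assumptions for illustration.

def exposure_gb(change_rate_gb_per_hour: float, backup_interval_hours: float) -> float:
    """Worst-case data created since the last restore point."""
    return change_rate_gb_per_hour * backup_interval_hours

def annual_downtime_cost(incidents_per_year: int, hours_per_incident: float,
                         cost_per_hour: float) -> float:
    """Simple linear model of yearly downtime cost."""
    return incidents_per_year * hours_per_incident * cost_per_hour

# Hypothetical enterprise producing 10 GB of new or changed data per hour:
print(exposure_gb(10, 24))  # nightly backups leave 240 GB at risk
print(exposure_gb(10, 1))   # hourly snapshots leave only 10 GB at risk

# Hypothetical: 4 incidents a year, 5 hours each, at $100,000 per hour
print(annual_downtime_cost(4, 5, 100_000))  # 2,000,000 under these assumptions
```

Even with modest assumed change rates, widening the backup window multiplies the data at risk linearly, which is why backup windows that ignore data growth quietly turn into permanent-loss exposure.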
The cloud has done a lot to relieve the burden, financial and otherwise, of wide-scale B&R. In fact, this is one of the primary drivers of IaaS, according to ResearchandMarkets: it provides a ready platform not only to integrate backed-up data into dynamic production environments, but also to maintain a duplicate IT infrastructure should primary resources go dark. IaaS also puts these capabilities within reach of the small-to-midsize enterprise.