As data becomes more fungible, that is, less tied to the physical infrastructure that supports higher-level virtual and cloud architectures, the overall data environment starts to exhibit new characteristics, some of which will dramatically alter the way those environments are built and operated.
Of late, the concept of data gravity has been showing up in tech conferences and discussion groups. Coined by VMware’s Dave McCrory about four years ago, it describes the way data behaves in highly distributed architectures. Rather than becoming evenly distributed across a flattened fabric, data tends to collect in pockets, with smaller bits of data gravitating toward larger sets the same way that particles coalesced into galaxies after the Big Bang. Part of this is due to the nature of distributed architectures where the farther away storage is from processing centers and endpoints, the greater the cost, complexity and latency. But it is also a function of the data itself, particularly now that all information must be “contextualized” with reams of metadata for it to be useful.
What should you consider before using the cloud for disaster recovery? Martin Welsh and Patricia Palacio provide some guidance.
Whatever the company size or industry, the truth is that your business can't afford downtime, but traditional DR strategy investments have been difficult to justify. Most organizations attempt to protect only mission-critical applications, leaving second-tier, but still valuable, systems vulnerable to extended outages. It's hard to justify improving your disaster recovery capabilities when you're under pressure to cut IT costs and when DR is seen as an expensive insurance policy.
The major challenges faced when planning your disaster recovery strategies are:
Ian Kilpatrick describes six emerging technology trends that will need consideration during 2014:
Thanks to the NSA and GCHQ (coupled with ongoing allegations against the Chinese), security, corporate privacy and encryption have moved swiftly up the corporate agenda. Identity management, which has often been seen as a ‘nice to have’, will become even more of a ‘must have.’
For many years, wireless security was an afterthought to wireless deployment. However, in 2014, with the ratification of multi-gigabit 802.11ac, wireless security will become ever more important as organizations move from wired networks to wireless ones.
As one example, the majority of wireless access point deployments in SMEs are connected to the trusted network, effectively bypassing the gateway security controls and policies. This isn’t sustainable, as wireless becomes the core of the network. There will be a rise in the deployment of both 802.11ac wireless and associated access point security.
Although the dust hasn’t yet settled on the Edward Snowden revelations about the activities of the US National Security Agency, the consequences already extend beyond the purely technical. While the immediate reaction was to think of better ways in which to encrypt data, it also dawned on foreign organisations that they might want to review certain business relationships. The idea that the NSA could have direct backdoors into many US companies dampened the enthusiasm of certain international entities to continue trading with them. But will American enterprises alone have to increase their efforts to maintain business continuity, or are companies in other countries affected too?
New capacity, rate reductions and competition are a few factors contributing to a softer market and an 11% drop in reinsurance rate on line—a calculation of reinsurance premium divided by reinsurance limit—almost across the board, according to Guy Carpenter.
Much of this was driven by a decline of 15% in the United States, while property catastrophe pricing in Continental Europe and the United Kingdom fell by 10% and 15%, respectively, Guy Carpenter said.
Willis Re said in its “1st View” report that soft market conditions are not unique to the property catastrophe market. The report found that “with few exceptions rates are down on most lines at Jan. 1.”
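The rate-on-line figure cited above is a simple ratio, which a short sketch makes concrete. The figures below are made up for illustration; they are not taken from the Guy Carpenter or Willis Re reports.

```python
def rate_on_line(premium: float, limit: float) -> float:
    """Rate on line = reinsurance premium divided by reinsurance limit."""
    return premium / limit

# Hypothetical example: a $2M premium on a $20M limit of coverage.
rol = rate_on_line(2_000_000, 20_000_000)
print(f"rate on line: {rol:.0%}")  # rate on line: 10%

# An 11% drop in rate on line, as reported for Jan. 1 renewals:
print(f"after an 11% drop: {rol * (1 - 0.11):.2%}")  # after an 11% drop: 8.90%
```

In other words, the reported declines describe the price of reinsurance per dollar of limit purchased, not a change in the limits themselves.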
One of the last things I wrote about in 2013 was the Target breach. I suspect that breach is going to linger for a while, not only for customers but for businesses that (I hope) are now thinking a lot more about the security of their credit card systems and their computer networks overall. I know one small business owner is, because she asked me the types of questions she should ask regarding the security of her system. (And those questions may be a blog post for another day.)
Right before I went on holiday break, I had an email conversation with some folks from Guidance Software regarding the Target breach and the forensic investigation into what happened. One of the first things I was told was that we shouldn’t have been surprised that this breach happened because it was inevitable. As Jason Fredrickson, senior director of application development at Guidance Software, told me:
CIO — In the world of IT, things can and will go wrong. Failure can come from a number of things such as rushing to get too much done in a single project instead of breaking it down into smaller, more manageable projects. It can come from not allowing enough lead time for developers to do their part on the back-end or even from a consultant or vendor that led you down the wrong path.
Whatever the case, failure does happen; it's to be expected, and as the saying goes, life is "10 percent what happens to you and 90 percent how you react to it." Failure doesn't have to be a negative. With the right attitudes and processes in place it can be educational, informative and sometimes transformative.
You know from a logical perspective that you should learn from your mistakes. That is drilled into many of us beginning in childhood. The problem, according to experts, is that in the corporate world, a lot of companies don't handle failure well. They don't have adequate processes in place to examine why something failed, yet that examination is a necessary part of the learning process.
Mobile CRM, which has been gaining momentum for quite some time, is a trend that will only get hotter in 2014, experts predict. Among other trends they expect to take root or accelerate in 2014: social CRM, more integration and smarter CRM.
Most industry observers agree that the adoption of mobile will be a dominant CRM theme in 2014 as companies look for ways to extend CRM capabilities to give employees convenient, always-on access to sales content, allowing them to address customer needs and collaborate with sales teams in real time.
"CRM capabilities will be integrated into mobile tools to generate leads and opportunities both in-store and on the road," said Chris O'Connor, founder and CEO of Taptera. "We see companies that are using CRM continue to invest in out-of-the-box solutions through extension into mobile channels and customization to monitor, manage and drive leads, conversions, shorten sales cycles and improve customer support."
The arrival of the first major winter storm of 2014 just two days into the new year makes this a good time to take stock of the insurance implications.
The Insurance Information Institute (I.I.I.) reports that winter storms are historically very expensive and are the third-largest cause of catastrophe losses, behind only hurricanes and tornadoes.
From 1993 to 2012, winter storms resulted in about $27.8 billion in insured losses—or $1.4 billion per year, on average, according to Property Claims Service for Verisk Insurance Solutions (see chart below).
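The per-year average cited above follows directly from the totals; a quick sketch confirms the arithmetic (1993 to 2012 inclusive is 20 years):

```python
# Check the averages cited from Property Claims Service for Verisk.
total_insured_losses = 27.8   # $ billions, winter storms, 1993-2012
years = 2012 - 1993 + 1       # 20 years, inclusive

avg = total_insured_losses / years
print(f"${avg:.1f} billion per year")  # $1.4 billion per year
```

The rounding is slight: $27.8 billion over 20 years works out to $1.39 billion annually, which the report states as $1.4 billion.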
Some of the best Big Data and sensor uses come from the manufacturing and logistics world. But while supply chains and manufacturing floors can generate plenty of important business data, those functions aren’t always the best equipped to use that data.
Operations, supply chains and manufacturing are due for a technology overhaul, according to IDC Manufacturing Insights and other analysts who research these B2B functions.
The problem: Supply chain technologies and processes lag behind the highly digital world of the business side.