Jon Seals

CIO — Few deny that the healthcare industry in the U.S. faces tremendous pressure to change. Few deny the role that technology will play in stimulating this change, either.

Uncertainty creeps in, though, when healthcare organizations try to address their technology needs. This is especially true of healthcare providers — the hospitals, medical offices, clinics and myriad long-term care facilities that account for roughly 70 percent of healthcare spending and that have spent much of the 21st century rushing to catch up to other vertical industries.

Most providers, says Skip Snow, a senior analyst with Forrester, are "very new to the idea that they have all this structured data in clinical systems." That's largely because, until recently, the mission of the healthcare CIO was ancillary to a provider's core mission. IT often fell under the CFO's domain, Snow says, since it focused so much on business systems.

...

http://www.cio.com/article/750183/Forrester_Outlines_IT_Imperatives_for_Healthcare_Providers

It was recently revealed that the personal details of 10,000 asylum-seekers housed in Australia were accidentally leaked via the Department of Immigration and Border Protection’s website. This has damaged asylum-seekers’ trust in the Australian government and, according to Greens Senator Sarah Hanson-Young, potentially put lives at risk. Such incidents represent significant breaches of local regulations and can result in heavy penalties.

Recent amendments to existing privacy laws in Australia and Hong Kong allow each country’s privacy commissioner to enforce significant penalties for repeated or serious data breaches. Countries like Japan and Taiwan, where new privacy laws have been passed and/or existing ones are being enforced more strictly, also assess penalties for noncompliance.

...

http://blogs.forrester.com/manatosh_das/14-03-24-what_asia_pacific_firms_must_learn_from_the_data_privacy_breach_in_australia

It’s funny how some myths continue to be believed, even by hard-nosed business people. The notion that virtualisation will save a company’s data is such a myth. Although it can be valuable in optimising an organisation’s use of IT resources and reacting quickly to changing IT needs, virtual environments are not inherently safer than independent physical servers. Yet data recovery provider Kroll Ontrack found that 80 percent of companies believe that storing data in virtual environments is less risky, or no riskier, than physical storage. Beliefs are one thing, statistics are another: 40 percent of companies using virtual storage were hit with data loss in 2012 – 2013. What’s going on?

...

http://www.opscentre.com.au/blog/why-server-virtualisation-is-not-a-disaster-recovery-plan/

Computerworld — Driven by a very strong belief in the future of software-defined data center technology, Bank of America is steering its IT to almost total virtualization, from the data center to desktop.

The technology does for the entirety of a data center what virtualization did for servers: It decouples hardware from the computing resources. Its goal is to enable users to create, expand and contract computing capability virtually, quickly and efficiently.

The software-defined data center is not yet a reality. But there are enough parts of the technology in place to convince David Reilly, Bank of America's global infrastructure executive, that it is the future.

"The software-defined data center is going to dramatically change how we provide services to our organizations," said Reilly. "It provides an opportunity for, in effect, the hardware to disappear.

"We think it's irresistible, this trend," said Reilly.

...

http://www.cio.com/article/750194/Bank_of_America_Sees_Software_Defined_Data_Centers_as_Irresistible_

Dell yet again signaled its intentions to compete more aggressively in the analytics space with the acquisition today of StatSoft.

With 1,500 customers, StatSoft is the second major analytics acquisition that Dell has made since acquiring Quest Software. In 2012, just prior to being acquired by Dell, Quest Software acquired Kitenga, a provider of high-end analytics software that usually gets applied to Big Data problems.

In contrast, John Whittaker, director of product marketing for Dell Information Management, says StatSoft represents a more mainstream play into the realm of predictive analytics. With the lines between analytics applications blurring, Whittaker says customers should expect to see Dell Software become significantly more aggressive in delivering analytics capabilities to the midmarket.

...

http://www.itbusinessedge.com/blogs/it-unmasked/dell-buys-way-into-predictive-analytics-by-acquiring-statsoft.html

About a month ago, I reported on a study from Ponemon Institute and AccessData that revealed that most companies are doing a poor job when it comes to detecting and effectively responding to a cyberattack. As Dr. Larry Ponemon, chairman and founder of the Ponemon Institute, said in a statement when the report was released:

“When a cyber-attack happens, immediate reaction is needed in the minutes that follow, not hours or days. It’s readily clear from the survey that IR processes need to incorporate powerful, intuitive technology that helps teams act quickly, effectively and with key evidence so their companies’ and clients’ time, resources and money are not lost in the immediate aftermath of the event.”

AccessData’s Chief Cybersecurity Strategist, Craig Carpenter, has been looking at this problem in some depth. We aren’t totally clueless on why these attacks are able to cause tremendous amounts of damage, both financial and reputational, to companies. For example, as information about the Target breach continues to trickle out, we have a pretty good idea of how and why the incident occurred. Our concern now, Carpenter said in a blog post, is fixing these problems. The key, he said, is prioritization and improved integration. In an email to me, Carpenter provided a few steps every company should take to prevent a “Target-like” breach in the future:

...

http://www.itbusinessedge.com/blogs/data-security/improving-cyberattack-response.html

InfoWorld — Apache Cassandra is a free, open source NoSQL database designed to manage very large data sets (think petabytes) across large clusters of commodity servers. Among many distinguishing features, Cassandra excels at scaling writes as well as reads, and its "master-less" architecture makes creating and expanding clusters relatively straightforward. For organizations seeking a data store that can support rapid and massive growth, Cassandra should be high on the list of options to consider.

Cassandra comes from an auspicious lineage. It was influenced not only by Google's Bigtable, from which it inherits its data architecture, but also Amazon's Dynamo, from which it borrows its distribution mechanisms. Like Dynamo, nodes in a Cassandra cluster are completely symmetrical, all having identical responsibilities. Cassandra also employs Dynamo-style consistent hashing to partition and replicate data. (Dynamo is Amazon's highly available key-value storage system, on which DynamoDB is based.)
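
The excerpt above credits Cassandra's distribution model to Dynamo-style consistent hashing. As a rough illustration of that idea only (the node names and token count below are invented, and this is not Cassandra's own partitioner code), a minimal Python sketch might look like this:

    import bisect
    import hashlib

    class ConsistentHashRing:
        """Toy Dynamo-style ring: every node owns several tokens on a hash circle."""

        def __init__(self, nodes, vnodes=8):
            # Give each node several "virtual" tokens so keys spread evenly.
            self.ring = sorted(
                (self._hash(f"{node}#{i}"), node)
                for node in nodes
                for i in range(vnodes)
            )
            self.tokens = [token for token, _ in self.ring]

        @staticmethod
        def _hash(value):
            return int(hashlib.md5(value.encode()).hexdigest(), 16)

        def replicas(self, key, n=3):
            # Walk clockwise from the key's position, collecting n distinct nodes.
            start = bisect.bisect(self.tokens, self._hash(key)) % len(self.ring)
            owners = []
            for i in range(len(self.ring)):
                node = self.ring[(start + i) % len(self.ring)][1]
                if node not in owners:
                    owners.append(node)
                if len(owners) == n:
                    break
            return owners

    ring = ConsistentHashRing(["node-a", "node-b", "node-c", "node-d"])
    print(ring.replicas("user:42"))  # prints three distinct replica nodes for the key

Because every node hashes onto the same ring, adding a node only reassigns the keys that fall into the token ranges it takes over, which is part of why such clusters can be expanded without a master coordinating the move.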

...

http://www.cio.com/article/750171/Cassandra_Lowers_the_Barriers_to_Big_Data

“If you’re not paranoid, you’re not paying attention.” It’s an old joke, but one that rings true as I finish my presentation for this Wednesday’s online webinar with The Disaster Recovery Journal. Here are just three of the danger signals from the 2014 Annual Report on the State of Disaster Recovery Preparedness that I’ll describe during the webinar.

DANGER SIGNAL 1: 3 out of 4 companies worldwide are failing in terms of disaster readiness. Having lots of company will be no consolation for organizations that have failed to respond to the alarming rise in intentional and accidental threats to IT systems.

DANGER SIGNAL 2: More than half of companies worldwide report having lost critical applications or most/all datacenter functionality for hours or even days. Once again, more evidence that businesses are at risk of crippling losses.

DANGER SIGNAL 3: Human error is the #2 cause of outages and data loss, cited by 43.5 percent of responding companies. How does your disaster recovery plan address this key vulnerability?

The good news? There are specific actions you can take right now to be better prepared to recover your systems in the event of an outage.

...

http://drbenchmark.org/webinar-this-week-the-state-of-disaster-recovery-preparedness/

SUNNYVALE, Calif. – With much industry attention on “software-defined” infrastructures, organizations considering software-defined storage are faced with multiple approaches to using software to pool, aggregate, manage and share storage resources while providing high levels of data integrity, availability and scalability.

A software-defined environment leverages software executing on industry-standard servers to pool and abstract hardware resources, thereby providing high levels of automation, flexibility and efficiency. It also enables convergence of compute and storage on the same set of standard servers.
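
As a loose sketch of that pooling idea, and nothing more (the server names, capacities and placement policy below are invented for illustration and are not Maxta's MxSP implementation), a toy Python example could look like this:

    class StoragePool:
        """Aggregate the local disks of standard servers into one logical pool."""

        def __init__(self, servers, replicas=2):
            # servers: mapping of server name -> free capacity in GB
            self.free = dict(servers)
            self.replicas = replicas
            self.placement = {}  # volume name -> servers holding a copy

        def total_free(self):
            return sum(self.free.values())

        def create_volume(self, name, size_gb):
            # Place each copy on the emptiest server, never two copies on one box.
            chosen = []
            for _ in range(self.replicas):
                candidates = {s: c for s, c in self.free.items()
                              if s not in chosen and c >= size_gb}
                if not candidates:
                    raise RuntimeError("not enough independent servers for replicas")
                target = max(candidates, key=candidates.get)
                self.free[target] -= size_gb
                chosen.append(target)
            self.placement[name] = chosen
            return chosen

    pool = StoragePool({"esx-01": 2000, "esx-02": 1500, "esx-03": 1800})
    print(pool.create_volume("vm-datastore-1", 500))  # ['esx-01', 'esx-03']

The point of the abstraction is that the caller asks the pool for capacity rather than for a particular array or server, which is what allows compute and storage to converge on the same hardware.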

Maxta, a developer of the VM-centric storage platform MxSP™ that turns standard servers into a converged compute and storage solution for virtualized environments, offers the following five tips for evaluating software-defined storage:

1. Does Software-Defined Storage work?

Yes. Convergence of compute and storage, and software-defined storage itself, have been field-proven at leading companies – Facebook, Amazon and Google, to name just a few. In many cases, though, software-defined storage was developed for specific workloads, mostly Big Data, rather than for general enterprise workloads. With the rise of virtual infrastructures in environments of all sizes, and the fact that server virtualization provides a natural platform for convergence, it’s clear that the combination of server virtualization and software-defined storage is the direction for enterprise workloads.

2. Does Software-Defined Storage make sense for me? 

Software-defined storage reduces costs and operational complexity while streamlining IT. It offers efficiency, simplicity, agility and availability benefits. There are few IT environments where these benefits are not desirable. 

3. Am I capable of implementing Software-Defined Storage?

Software-defined storage is simple to implement and configure. It should be a seamless, plug-and-play integration in any virtualized environment, and should not require customization, day-to-day management, or any special storage/networking competency. In most cases, a single administrator can manage both compute and storage resources. 

4. Can I use my existing storage and servers to deploy Software-Defined Storage?

Yes. Software-defined storage solutions should seamlessly co-exist with the existing infrastructure. You should be able to use your existing servers as the platform for software-defined storage and your existing storage arrays for all the applications that are already using them. Moreover, software-defined storage should be able to leverage existing storage arrays for capacity, with the benefit of managing all storage resources through a single-pane-of-glass console that treats storage assets as simply as VMs.

5. Do I have to sacrifice enterprise-class data services to implement Software-Defined Storage?

No. Software-defined storage implementations should provide all enterprise-class services, such as data sharing, live migration of virtual machines, dynamic load balancing, high availability, disaster recovery, snapshots, clones, thin provisioning, inline compression and data deduplication. These services should be delivered on industry standard servers alongside the server virtualization software and applications. 

Maxta provides proven software-defined storage for any virtualized data center environment that is easy to implement and manage. The Maxta Storage Platform delivers significant capital and operational cost savings, while dramatically simplifying IT. It aggregates and converges dispersed compute and storage resources on standard commodity hardware, and provides all the enterprise-class capabilities required by the virtual data center. 

Follow Maxta on Twitter, LinkedIn and Facebook

About Maxta
Maxta is redefining enterprise storage by delivering a storage platform that is simple, agile and cost efficient for virtualized environments as well as supporting a wide spectrum of enterprise-class data services and capacity optimizations. For more information visit http://www.maxta.com.

NEWARK, Calif. – Tegile Systems, the leading provider of flash-driven storage arrays for virtualized server and virtual desktop environments, today announced that it has won Info-Tech Research Group’s Trend Setter Award and was listed as an Innovator in Info-Tech’s Small to Mid-Range Storage Arrays Vendor Landscape report.

Info-Tech Research Group Vendor Landscape reports recognize outstanding vendors in the technology marketplace. Assessing vendors by the strength of their offering and their strategy for the enterprise, Info-Tech Research Group Vendor Landscapes pay tribute to the contribution of exceptional vendors in a particular category. Info-Tech’s Trend Setter Award is presented to the most original/inventive solution evaluated.  Those designated Innovators have demonstrated innovative product strengths that act as their competitive advantage in appealing to niche segments of the market.

“Tegile has turned heads with the Zebi Storage Array’s Metadata Accelerated Storage System,” according to the Info-Tech Research Group research document Vendor Landscape: Small to Mid-Range Storage Arrays, 2014. “Tegile’s differentiators are its deduplication and compression technologies, which increase usable storage while driving performance even on spinning disk media. The Zebi series is a good fit for organizations that demand high performance for VDI and analytic initiatives.”

Zebi storage arrays employ all-flash storage with a hybrid twist, leveraging the performance of SSDs and the low cost per TB of high-capacity disk drives to deliver as much as seven times the performance of legacy arrays while requiring up to 75 percent less capacity. This unique approach has seen marked adoption among companies that need faster performance than HDD-based arrays at less expense than SSD-based arrays.

“We are pleased to be recognized as both a Trend Setter and Innovator by Info-Tech Research Group as part of the select group of vendors evaluated in the organization’s Small to Mid-Range Storage Arrays Vendor Landscape report,” said Rob Commins, vice president of marketing at Tegile. “We’ve worked diligently to offer customers a new generation of flash-driven enterprise storage arrays that balance performance, capacity, features and price for virtualization, file services and database applications.  Having that hard work vetted by industry analysts with the resulting research offered to IT leaders looking to implement the most appropriate storage array for their organization is especially rewarding.”

About Info-Tech Research Group 
With more than 30,000 paid members worldwide, Info-Tech Research Group (www.infotech.com) is the global leader in providing tactical, practical Information Technology research and analysis. Info-Tech Research Group has a sixteen-year history of delivering quality research and is North America's fastest growing full-service IT analyst firm.

About Tegile Systems
Tegile Systems is pioneering a new generation of flash-driven enterprise storage arrays that balance performance, capacity, features and price for virtualization, file services and database applications. With Tegile’s Zebi line of hybrid storage arrays, the company is redefining the traditional approach to storage by providing a family of arrays that is significantly faster than all hard disk-based arrays and significantly less expensive than all solid-state disk-based arrays. 

Tegile’s patent-pending MASS technology accelerates the Zebi’s performance and enables on-the-fly de-duplication and compression of data so each Zebi has a usable capacity far greater than its raw capacity. Tegile’s award-winning technology solutions enable customers to better address the requirements of server virtualization, virtual desktop integration and database integration than other offerings. Featuring both NAS and SAN connectivity, Tegile arrays are easy-to-use, fully redundant, and highly scalable. They come complete with built-in auto-snapshot, auto-replication, near-instant recovery, onsite or offsite failover, and virtualization management features. Additional information is available at www.tegile.com. Follow Tegile on Twitter @tegile.
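
As a generic footnote on how inline deduplication can make usable capacity exceed raw capacity, the mechanism the description above alludes to, here is a minimal Python sketch. The block size and sample data are arbitrary, and the code is purely illustrative; it bears no relation to Tegile's patent-pending MASS implementation.

    import hashlib

    class DedupStore:
        BLOCK = 4096  # bytes per block; an arbitrary choice for the example

        def __init__(self):
            self.blocks = {}  # fingerprint -> block contents, stored once
            self.files = {}   # file name -> ordered list of fingerprints

        def write(self, name, data):
            refs = []
            for i in range(0, len(data), self.BLOCK):
                chunk = data[i:i + self.BLOCK]
                fingerprint = hashlib.sha256(chunk).hexdigest()
                self.blocks.setdefault(fingerprint, chunk)  # duplicate chunks stored once
                refs.append(fingerprint)
            self.files[name] = refs

        def logical_vs_raw(self):
            # Logical bytes the files reference vs. raw bytes of unique blocks kept.
            logical = sum(len(self.blocks[fp]) for refs in self.files.values() for fp in refs)
            raw = sum(len(block) for block in self.blocks.values())
            return logical, raw

    store = DedupStore()
    store.write("vm1.vmdk", b"A" * 8192 + b"B" * 4096)
    store.write("vm2.vmdk", b"A" * 8192)   # fully deduplicated against vm1
    print(store.logical_vs_raw())          # (20480, 8192): 20 KB of data in 8 KB of unique blocks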