
Summer Journal

Volume 27, Issue 3


Jon Seals

Here are my predictions for 2014:

  1. 2014 will bring exponential expansion and evolution of the Internet of Things (IoT).
    This expansion will also bring opportunities for information security trailblazers unlike any we’ve seen before. The potential benefits of the IoT will be huge, but so will the new and constantly evolving security and privacy risks, and we will see some significant privacy breaches resulting from the use of IoT devices. These new risks, and the incidents and breaches that follow, will require technical information security pros to also understand privacy concepts so they can build privacy protections into the devices themselves, and into the processes and environments where the devices are used. Basic information security and privacy concepts still apply, but very little has actually been done to implement security or privacy controls in these new technologies. We will need more information security and privacy professionals who can recognize new risks as they emerge; there is no textbook to turn to for these answers as the risks evolve.



Cloud storage providers want your business, and they are actively exploring numerous strategies to get it.

However, catering to professional organizations is very different from catering to individuals, even if those individuals use their personal clouds to house business data. The provider, or providers, who can establish robust, enterprise-friendly storage environments will reap a substantial reward as organizations look to scale infrastructure to take on Big Data and other challenges.

This is why so many cloud providers are introducing a wide range of top-tier storage features in their platforms. Box, for example, recently added a new administration console that aims to extend visibility and control into its hosted environment. The system includes protections for personal data like credit card numbers and Social Security information, as well as data and traffic analysis tools to help organizations better manage resource consumption and red-flag unusual usage patterns. There are also new automation and content management suites with improved workflow and search functions.
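Protections like the credit card screening mentioned above typically pair a digit-run pattern with a checksum test. The sketch below is purely illustrative (a generic Luhn check, not Box's actual implementation, and the function names are invented):

```python
# Hypothetical sketch of a DLP-style scan for credit card numbers:
# find 13-16 digit runs, then keep only those that pass the Luhn checksum.
import re

def luhn_valid(digits: str) -> bool:
    """Standard Luhn checksum: double every second digit from the right."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_card_numbers(text: str) -> list:
    """Return digit strings that look like valid card numbers."""
    hits = []
    for candidate in re.findall(r"\b(?:\d[ -]?){13,16}\b", text):
        digits = re.sub(r"[ -]", "", candidate)
        if 13 <= len(digits) <= 16 and luhn_valid(digits):
            hits.append(digits)
    return hits

print(find_card_numbers("order ref 1234, card 4111 1111 1111 1111"))
# ['4111111111111111']
```

A real scanner would also check context and known issuer prefixes to cut false positives; the Luhn test alone only filters random digit runs.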



When is the last time you personally experienced a hard drive failure?

A few years ago, thieves broke into our RV and stole the laptops, hard drives, and basically anything not nailed down.

At the time, I had a backup strategy, but I had pushed the backup and swap back by two days (until after the weekend). As a result of that fateful decision, I lost a few weeks of work and a few gigabytes of pictures. I recreated the work, but the pictures are gone.

I learned the importance of sticking to the backup plan, having multiple backups (in different locations), and never leaving a phone with a laptop. Never.

Last summer, as the hard drive on my roughly four-year-old laptop signaled it was failing, I was ready. I had a backup. And to be safe, I had a backup of my backup.
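The habit described above, multiple copies in different places and never trusting an unverified copy, can be sketched in a few lines. This is an illustrative example only; the paths and function names are invented:

```python
# Minimal sketch of a copy-then-verify backup to multiple destinations.
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup(source: Path, destinations: list) -> bool:
    """Copy source to every destination; a copy that fails its
    checksum check is not counted as a backup."""
    expected = sha256(source)
    for dest_dir in destinations:
        dest_dir.mkdir(parents=True, exist_ok=True)
        copy = dest_dir / source.name
        shutil.copy2(source, copy)
        if sha256(copy) != expected:
            return False
    return True

# Demo: temporary directories stand in for two separate drives.
work = Path(tempfile.mkdtemp())
original = work / "photos.dat"
original.write_bytes(b"irreplaceable pictures")
ok = backup(original, [work / "drive_a", work / "drive_b"])
print(ok)  # True: both copies exist and match the source
```

The point of the checksum step is that a scheduled copy job can silently produce a corrupt or truncated file; verifying at backup time catches that before the original is lost.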

- See more at: http://blogs.csoonline.com/security-leadership/2874/using-evidence-hard-drive-failure-backblaze-increase-value-security



By Rev. David L. Myers, Director, DHS Center for Faith-based and Neighborhood Partnerships

History is a great teacher.

Associate Pastor Ben Davidson of Bethany Community Church learned a valuable lesson during Hurricane Katrina in 2005 that benefitted him and his congregation on the morning of Nov. 17, 2013, when a powerful tornado tore through Washington, Ill.

His quick thinking reminds me that when disasters occur, having a plan can save lives and help pivot a community toward a strong recovery. I have learned this lesson many times through the faith leaders I’ve engaged as director of the DHS Center for Faith-based & Neighborhood Partnerships.

On Sunday morning, Pastor Davidson was preparing to begin his adult Sunday school class when he received an emergency phone call. A tornado had touched down, and their church was in its path.
He and the staff immediately moved the congregation -- particularly the children -- to the designated shelter inside the church, where they began to pray together as the storm passed through their community.

The entire congregation comforted one another through what Pastor Davidson recalls as "the longest 45 minutes of my life." Once all congregants were accounted for and families could leave the sheltered location, Pastor Davidson went straight home to confirm the safety of his children, who were at home sick that morning.

Immediately following the disaster, Bethany Community Church joined its fellow members of the Washington Ministerial Association, AmeriCorps and the Illinois Voluntary Organizations Active in Disaster to help coordinate the community’s recovery efforts.


Since the devastating event, more than 4,000 community volunteers have registered with Bethany Community Church to help their loved ones and neighbors during disasters.  Their effort and commitment will help to increase the community’s resilience and ensure they are better prepared for emergencies.

The story of Washington, IL, and Bethany Community Church is a reminder of the care and compassion that faith-based organizations can provide all survivors in times of disaster. Their story reinforces the power of a whole community, “survivor centric” approach and the important role and responsibility of faith leaders in preparing their communities before disasters strike.

I encourage you to know what to do before disaster strikes by joining the thousands of faith-based and community members in the National Preparedness Coalition's faith-based community of practice and connecting with faith leaders across the country who are working on preparedness.

Being prepared contributes to our national security, our nation’s resilience, and our personal readiness.

CIO — It's the time of year when darkness comes early and people begin to sum up how this year has gone and next year will unfold. It's also the time of year that predictions about developments in the technology industry over the next 12 months are in fashion. I've published cloud computing predictions over the past several years, and they are always among the most popular pieces I write.

Looking back on my predictions, I'm struck not so much by any specific prediction or even the general accuracy (or inaccuracy) of the predictions as a whole. What really comes into focus is how the very topic of cloud computing has been transformed.

Four or five years ago, cloud computing was very much a controversial and unproven concept. I became a strong advocate of it after writing Virtualization for Dummies and being exposed to Amazon Web Services in its early days. I concluded that the benefits of cloud computing would result in it becoming the default IT platform in the near future.



AUSTIN, Texas – As cloud storage technology is increasingly adopted for a broad range of consumer and business applications, the storage landscape is transitioning away from traditional disk arrays to object-based storage systems. A recent IDC report predicts that the market for File- and Object-Based Storage (FOBS) will experience an annual growth rate of 24.5% through 2017, reaching $38 billion. “Increased versatility will result in more diverse use cases for FOBS,” said IDC. 

Software-based object storage is not saddled with the cost, complexity and vendor lock-in of legacy storage arrays or the scalability limitations of traditional file system storage. 

Experts at Caringo®, the leading provider of object storage software, offer the following six reasons object storage delivers the scalability, availability, resiliency and accessibility needed for cloud-scale storage.

1. No Single Point of Failure: 
The most efficient object storage systems are built on a symmetric architecture in which all nodes run the same code, eliminating any single point of failure while delivering high availability and unprecedented scalability.

Why this matters: When you hear management node, controller node, or database, this means more management overhead and additional points of failure that can critically impact performance, stability and fault tolerance. In highly available object storage solutions, all nodes do the same thing, so if one fails the others can immediately remedy the issue. This also eliminates the need for specialized hardware that must be physically shipped when an issue is discovered.
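The symmetric idea can be sketched in a few lines. This is a toy illustration of the principle (deterministic placement that any node can compute), not Caringo's actual code, and the class and node names are invented:

```python
# Toy symmetric cluster: every node runs the same lookup code, so there
# is no controller to lose. When a node fails, the survivors simply
# recompute placement for its objects.
import hashlib

class SymmetricCluster:
    def __init__(self, nodes):
        self.nodes = sorted(nodes)

    def node_for(self, object_id: str) -> str:
        """Any node can run this same deterministic lookup."""
        digest = hashlib.sha256(object_id.encode()).hexdigest()
        return self.nodes[int(digest, 16) % len(self.nodes)]

    def fail_node(self, node: str) -> None:
        # No management node to consult: membership just shrinks.
        self.nodes.remove(node)

cluster = SymmetricCluster(["node1", "node2", "node3"])
home = cluster.node_for("invoice-42")
cluster.fail_node(home)                # the object's node dies...
print(cluster.node_for("invoice-42"))  # ...and a survivor takes over
```

Production systems use consistent hashing or similar schemes so that a membership change remaps only a fraction of objects; the simple modulo here remaps more than necessary but shows why no single node is special.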

2. Flexible Data Protection on a Per-Object Basis:
Data protection flexibility is critical as no one data protection scheme works for every use case. Object storage systems need both replication and erasure coding, as well as the ability to move between the two, all in the same cluster to ensure comprehensive, efficient data protection. 

Why this matters: Different environments and even different objects require different combinations of replication and erasure coding. Object storage solutions that limit the flexibility of changing from one protection scheme to the other, or lock the protection scheme to specific hardware, ultimately hinder growth and the ability to optimize resources. Support for both protection schemes on the same server means you can ensure access, data protection and resource utilization system wide – without constraints.
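The trade-off between the two protection schemes can be shown with a toy example. Real object stores use Reed-Solomon codes rather than single-parity XOR, but the principle, surviving a lost fragment at far less than full-copy overhead, is the same:

```python
# Toy contrast: 3x replication stores three full copies of an object,
# while a 2-data + 1-parity XOR erasure code survives losing any one
# fragment at half the overhead.
def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

data = b"object!!"                    # 8 bytes, even length for the demo
frag1, frag2 = data[:4], data[4:]     # two data fragments
parity = xor_bytes(frag1, frag2)      # one parity fragment

# Simulate losing frag1: rebuild it from the survivors.
rebuilt = xor_bytes(frag2, parity)
print(rebuilt + frag2 == data)        # True

replication_overhead = 3 * len(data)                      # 24 bytes
erasure_overhead = len(frag1) + len(frag2) + len(parity)  # 12 bytes
print(replication_overhead, erasure_overhead)             # 24 12
```

Replication still wins for small, hot objects (whole copies are cheap to read); erasure coding wins for large objects where the capacity savings dominate, which is why supporting both per object matters.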

3. Support for Large and Small Files: 
Object stores must be designed to handle a broad range of applications and workloads without performance impact, and be equally capable of storing and accessing billions of small files, documents, and emails – or very large files like high-definition videos.  

Why this matters: Functionality regardless of file size is important to ensure performance. While compression algorithms get more efficient in making files smaller, continued technological advancements will result in larger files. An object storage solution delivers rapid access and efficient storage, regardless of file size or object count. 

4. Granular, Automated Scalability: 
Best-in-class object stores are highly scalable, allowing the addition of a single disk all the way up to multiple nodes to extend the capacity or performance.  

Why this matters: Granular scalability lets you scale as you grow and eliminates the need to over purchase hardware because of the storage system’s technical limitations.

5. Continuous Integrity Checks and Fast Volume Recovery:
Best-of-breed solutions continuously check content integrity based on both the data protection schemes in use and the integrity of the content itself. If a bad disk is discovered, recovery should be distributed, with the rate of repair accelerating as the storage solution grows.

Why this matters: Content should always be available. Unfortunately this is an area where some object storage solutions need improvement. Some only check content integrity on reads, which is not the optimal time to ensure data integrity. Others employ specialty nodes to identify and repair issues, which limits scale and creates bottlenecks.
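The difference between check-on-read and continuous scrubbing can be sketched as follows. The storage structures here are invented for illustration; real systems hash at disk-block or object-fragment granularity:

```python
# Minimal sketch of a background integrity scrub: record a SHA-256
# digest with each object at write time, then periodically re-hash the
# stored bytes and flag mismatches, rather than waiting for a read to
# discover silent corruption.
import hashlib

store = {}   # object_id -> (bytes, digest recorded at write time)

def put(object_id: str, data: bytes) -> None:
    store[object_id] = (data, hashlib.sha256(data).hexdigest())

def scrub() -> list:
    """Re-verify every object; return the ids that fail the check."""
    return [oid for oid, (data, digest) in store.items()
            if hashlib.sha256(data).hexdigest() != digest]

put("a", b"good data")
put("b", b"soon to rot")
print(scrub())                        # []  -- everything still intact

# Simulate silent disk corruption of object "b", then scrub again.
data, digest = store["b"]
store["b"] = (b"soon to rXt", digest)
print(scrub())                        # ['b']
```

Once a scrub flags an object, a system with distributed data protection can rebuild it from replicas or parity fragments on the surviving nodes, which is why repair speeds up as the cluster grows.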

6. Instant Content Look-up and Retrieval:
Best-of-breed solutions allow queries against the object store based on object attributes or customizable metadata “tags” stored with the object. Because metadata is stored with the object, content is self-contained and security, authentication, and all other identifying information is always available regardless of application, employee turnover, technological obsolescence or even time. 

Why this matters: As the amount of content grows from millions to billions of objects and management resources change (hardware migration and employee turnover), efficient content look-up and retrieval becomes a challenge. Some object solutions store metadata in a separate database, which introduces an additional layer of complexity between content requests and content delivery – a textbook bottleneck. Databases also become unwieldy with size and require investment in specialized management resources. 

To learn about how to evaluate cloud storage options, identify the commonalities and differences among solutions and get a cheat sheet to assist in your evaluation, click here for an on-demand webinar hosted by Caringo. 
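The self-contained metadata model in point 6 can be illustrated with a toy query. The object layout and tag names below are invented for the example; the point is that the tags travel with the object, so there is no separate metadata database to keep in sync:

```python
# Each object carries its own tag dictionary, so a lookup is just a
# filter over the objects themselves.
objects = [
    {"id": "vid-001", "size": 2_400_000,
     "tags": {"dept": "marketing", "type": "video"}},
    {"id": "doc-107", "size": 14_000,
     "tags": {"dept": "legal", "type": "pdf"}},
    {"id": "vid-002", "size": 9_100_000,
     "tags": {"dept": "marketing", "type": "video"}},
]

def query(**wanted):
    """Return ids of objects whose tags match every requested key=value."""
    return [o["id"] for o in objects
            if all(o["tags"].get(k) == v for k, v in wanted.items())]

print(query(dept="marketing", type="video"))  # ['vid-001', 'vid-002']
print(query(dept="legal"))                    # ['doc-107']
```

At billions of objects a real store indexes these tags on each node rather than scanning, but the contract is the same: the query runs against the objects, not against an external database that can drift out of date.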


About Caringo
Caringo develops object storage software that gives you control over any volume, flow or size of unstructured information; dramatically reducing complexity and costs while extracting maximum value and performance from hardware. Caringo's unique benefits are delivered through a symmetric architecture that enables massive scalability, elastic content protection, and automation of management in a comprehensive software suite. The result is the industry's most efficient object storage software ideal for cloud storage, big data, and active archives in any industry. 

Kroll Ontrack reveals latest trends in data recovery 

MINNEAPOLIS, Minn. – The continuing proliferation of new drive types and the ever-growing problem of malware were among the biggest trends impacting the data recovery industry in 2013, according to year-end information from data recovery and ediscovery products and services provider Kroll Ontrack. The trends further underscore the need for businesses and consumers to understand how evolving technology affects their ability to protect and recover critical data.

Solid state drives (SSD) & other flash devices: Dozens of different manufacturers, all with unique technology  

As prices for SSD and other flash devices continue to decrease and align more closely with hard drive prices, nearly 10 percent of Kroll Ontrack recoveries are now flash media. Beyond a greater share of SSD and other flash-based recoveries, Ontrack Data Recovery engineers grappled with new drive formats, such as hybrid drives, which contain both SSD and spinning-drive components. Hybrid drives promote optimization and tiering, storing frequently accessed “hot” data on the faster SSD portion and less-accessed data on the slower spinning portion of the drive, or using the flash-based portion as a cache.

“With SSD and other flash standards still evolving, each new drive format is specific to the manufacturer and therefore requires a new just-in-time (JIT) data recovery toolset and methodology, which impacts recovery speeds and quality,” said Troy Hegr, data recovery technology manager, Kroll Ontrack. “With that in mind, regular backups are critical. Further, SSD and other flash device users should download the manufacturer’s software tools from its website to optimize and monitor the health of the drive.”

Hard Drives: Greater capacity requires new approaches to data recovery
SSD and other flash media weren’t the only storage media on the cutting edge in 2013. Leading hard drive manufacturers innovated to pack more capacity into drives. For example, Hitachi built helium-filled drives. Because helium is less dense than air, the drive heads fly more freely with less resistance, allowing Hitachi to put platters closer together and thus pack more platters into each drive. In contrast, Seagate is increasing hard drive capacity through shingled magnetic recording (SMR) technology, which stores data bits in overlapping rather than linear patterns.

“The impact on data recovery from these newer technologies is yet to be determined,” said Hegr. “For example, opening a helium-filled drive in a cleanroom environment could cause the drive heads to crash more easily and make data recovery much more challenging. We are therefore closely watching these technology developments, and testing various methods to safely and effectively address them in a cleanroom environment.”

Viruses: New malware impacts data accessibility

In 2013, the CryptoLocker virus was born, hijacking computers and networks in exchange for ransom. CryptoLocker is a Trojan horse, a form of ransomware, targeting computers running Windows®. The attack usually arrives disguised as a legitimate email attachment. When activated, the malware encrypts certain types of files with a private key stored only on the malware's control servers and displays a message stating the data can be decrypted for payment by a certain deadline. If the deadline passes, the warning threatens that the private key will be deleted and the data will be unrecoverable. Some victims have been able to unlock their files after the initial deadline, but at a cost higher than the original ransom requested.

“This virus has unfortunately succeeded because the cost of downtime to businesses can be as high as $5,600 a minute, according to the Ponemon Institute, so businesses are finding it cheaper and more efficient to cater to the demands of these hackers,” said Abhik Mitra, data recovery product manager, Kroll Ontrack. “Criminals clearly understand how valuable data is to businesses and individuals. The takeaway is to be wary of suspicious emails, and take the extra step of backing up in case you fall victim to these scams.”

Encryption: Leveraging data recovery expertise to validate security

While customers turned to Kroll Ontrack to reverse the impact of viruses like CryptoLocker, data storage companies proactively looked to Kroll Ontrack in 2013 to do the reverse – test, validate and certify the effectiveness of the encryption integrated into storage products to ensure no one can gain unauthorized access to the data. For data protection, encryption is a must and is becoming more commonplace. However, encryption adds a layer of recovery complexity because the encryption key is required.

With software-encrypted drives, such as those using Microsoft BitLocker, Check Point PointSec, McAfee Safeboot and others, the user holds the key and can supply it to the data recovery company when needed. This is in contrast to hardware-encrypted drives, such as Self-Encrypting Drives (SED) or Full Disk Encryption (FDE), where the key is built right into the drive. If a hardware-encrypted drive becomes corrupted or malfunctions due to physical, logical or electrical issues, the key is essentially locked in the drive, requiring data recovery engineers to bypass the failure to get the drive working and then decrypt the data as part of reading the drive. For these reasons, Kroll Ontrack is focusing more of its research and development efforts on handling encrypted data more efficiently.

Do-it-yourself: Tech savvy consumers are increasingly attempting data recovery

In 2013, Kroll Ontrack also saw a continued increase in the number of users taking it upon themselves to recover data. In fact, more than 10 percent of the time, Kroll Ontrack saw drives that showed signs of data access attempts, which can hinder recovery efforts.

“DIY software is a cost-effective and proven solution for individuals and businesses that are both willing and comfortable to try data recovery on their own,” said Mitra. “The key is knowing when software is applicable to the situation. If physical damage to the drive is obvious, the operator should power down the drive and consult a professional data recovery company to avoid any further data loss.”

About Kroll Ontrack Inc.

Kroll Ontrack provides technology-driven services and software to help legal, corporate and government entities as well as consumers manage, recover, search, analyze and produce data efficiently and cost-effectively. In addition to its award-winning suite of software, Kroll Ontrack provides data recovery, data destruction, electronic discovery and document review. For more information about Kroll Ontrack and its offerings please visit: www.krollontrack.com or follow @KrollOntrack on Twitter.

TAMPA, Fla. – ReEmployAbility, the largest national provider of early return-to-work (RTW) services and transitional employment programs, announced results from an RTW program with Waste Management, the largest environmental solutions provider in North America. ReEmployAbility’s Transition2Work® Program helped Waste Management supplement its in-house RTW program, successfully assisting injured workers whom Waste Management could not accommodate with modified light duty assignments.

“We have a very successful Transition to Recovery program that reduced our daily TTD (temporary total disability) headcount significantly, but we flatlined,” said Bob Drew, Director of Risk Management for Waste Management. “ReEmployAbility had a different spin—sending people with restrictions to work at non-profits. This allows us to accommodate injured workers with restrictions in areas where we don’t have any appropriate opportunities, and the results have been fantastic!”

The Transition2Work program places injured workers with local non-profit organizations to perform modified light duty assignments. The employer pays wages to the injured worker while at the non-profit. The injured worker benefits by easing back into the workforce, enjoys the camaraderie of a work environment, and gets the benefit of helping others. The employer retains a valuable employee, reduces indemnity spend, and can even get a tax deduction for wages paid to the employee while on assignment at the non-profit.

“Many employers have in-house programs,” said Debra Livingston, ReEmployAbility’s co-founder. “Transition2Work gives employers a flexible alternative that can fill in the gaps of their in-house programs, like it did for Waste Management. If you have one case or one hundred, we can place your injured worker in a suitable light duty assignment.”

Partnering with nearly 20,000 nonprofit organizations across the nation, ReEmployAbility offers quick placement within 1.8 days on average, and turnkey RTW program services, including injured worker communication and complete claim documentation.

“We don’t offer a cookie cutter return to work program,” stated Frances Ford, ReEmployAbility’s co-founder. “After ten years of specializing in return to work services, our Transition2Work program fits the needs of every employer, and most importantly, every injured worker.”

Waste Management began the Transition2Work program with a pilot program earlier this year, and is now in the process of rolling out the program to its entire organization.

About ReEmployAbility:

Founded in 2003, ReEmployAbility is the largest national provider of early return-to-work (RTW) services and transitional employment programs. Our Transition2Work program offers employers a turnkey, cost-effective solution to modified light duty assignments, reducing claim costs while giving the injured worker time to heal. Utilizing our accredited, national network of nonprofit partners, we create innovative programs to help accommodate injured workers in the transition back to work. For more information, call 866-663-9880, visit www.ReEmployAbility.com or read our blog at www.transition2work.us.

Storage Switzerland Report Confirms Metalogix's End-to-End Data Migration and Archival Solutions Enable Organizations to Increase Efficiency and Reduce Costs of Unstructured Data Management


WASHINGTON, D.C. - Metalogix, the leading provider of content infrastructure software to improve the use and performance of enterprise content on Microsoft SharePoint, Exchange, file and cloud platforms, today announced that leading analyst firm Storage Switzerland has recognized its Metalogix Archive Manager Exchange Edition 6.0 and Metalogix Email Migrator 1.0 as key vendor-agnostic solutions for the seamless movement of data into and out of the cloud, as well as across heterogeneous onsite data center platforms. The Storage Switzerland research note, "Cloud Data Control - Metalogix," written by senior analyst Colm Keegan, found that the newly launched Exchange solutions from Metalogix help customers flexibly adapt to changing business and market conditions while reducing complexity, mitigating risk and improving user productivity.

"Many organizations are interested in reaping the benefits of ubiquitous, low cost public cloud storage but their genuine concerns over losing control of their data are holding many back from taking the plunge," said Keegan. "By using Metalogix data migration and archiving technology to move data into and out of cloud infrastructure, skittish business owners can alleviate their fears over lack of cloud control, while reaping cloud's numerous benefits."

With over 70 billion emails archived both on premises and in the cloud, the award-winning Metalogix Archive Manager Exchange Edition provides immediate access straight from Outlook, OWA or mobile devices. End users benefit from bottomless mailboxes that can be flexibly and securely accessed from anywhere in the world. The Metalogix Email Migrator enables platform-to-platform movement, from legacy archives to live email systems: migrate from the ground to the cloud, cloud to cloud, and back again. The Metalogix Email Migrator is built on decades of migration experience, with over 50,000 terabytes migrated across on-premises and cloud environments.

"Legacy email archives require costly maintenance and renewal commitments and are often slow performing, complex, difficult to manage and costly to upgrade.  As a result, many organizations are seeking streamlined methods to move off of their restrictive legacy email archives to benefit from the speed, efficiency and management simplicity of a dynamic archive optimized for on premises or cloud environments," said Steven Murphy, CEO, Metalogix. "Metalogix's email solutions were developed to help organizations escape email archive 'vendor lock in' with an easy approach to migrate their archived data to their supplier of choice based on business demands.  And then, to streamline management, while dramatically lowering costs."

Tweet this: @colmswiss Latest @StorageSwiss Briefing Note "Cloud Data Control - Metalogix"

Join the conversation:

- Twitter: https://twitter.com/Metalogix and https://twitter.com/MetalogixEmail

- LinkedIn: http://www.linkedin.com/company/metalogix-software

About Metalogix

Metalogix provides content infrastructure software to enhance the use and performance of enterprise content. For over a decade, Metalogix has transformed the way commercial and government organizations manage terabytes of content to improve knowledge sharing and collaboration. Today, more than 21,000 customers rely on the company's products to upgrade, migrate, organize, store, archive and replicate content on Microsoft SharePoint, Exchange and Cloud platforms. Metalogix is a privately held company backed by Insight Venture Partners and Bessemer Venture Partners.

NoVA Becomes the First Market to Initiate OIX Standardization Guidelines


  • Open-IX Association announces today the application process for OIX certification is now open in the Northern Virginia (NoVA) market.
  • “Industry executives and engineers alike are eager to standardize the way they operate in order to streamline the process and provide for efficiencies, resiliency and transparency across the global Internet,” comments Martin Hannigan, co-founder and Treasurer of Open-IX Association.
  • "We are a great supporter of the Open-IX initiative, as the principles of openness and community involvement have been at the core of LINX for the last 20 years. We are delighted to be the first IXP to apply for Open-IX certification,” comments John Souter, CEO of LINX.

CAMBRIDGE, Mass. – The Open-IX Association (OIX), a neutral, non-profit industry association formed to promote better standards for data center interconnection and Internet Exchanges in North America, announces the application process for OIX certification is now open in the Northern Virginia (NoVA) market.

The application process is the initial step to becoming an OIX-certified data center or Internet Exchange Point (IXP). To initiate the process, applicants submit a deposit and processing fee along with their preferred designation as an Open-IX data center or IXP. After the application is accepted, data centers and IXPs are expected to review and comply with the OIX technical requirements by completing and submitting a detailed compliance report.

The application process is the first step required to become a technically certified OIX company and is different from basic membership in the association. Membership is open to any company with an interest in Internet Exchange Points, data center interconnection and infrastructure interconnect engineering, or research, that wishes to participate in shaping the operations, engineering and future of interconnecting.

“The interest in OIX has been tremendous and companies are quickly seeking ways to adapt to the data center and IXP standards set forth by the association,” comments Martin Hannigan, co-founder and Treasurer of Open-IX Association.  “Industry executives and engineers alike are eager to standardize the way they operate in order to streamline the process and provide for efficiencies, resiliency and transparency across the global Internet.”

“DuPont Fabros Technology has been supporting the Open-IX Association since its inception,” comments Vinay Nagpal, Director of Carrier Relations for DuPont Fabros Technology.  “We are strong proponents of open peering in North America as we believe it will benefit our customers and the Internet community at large.  We are delighted to be part of this process and look forward to an open platform to connectivity.”

"We are a great supporter of the Open-IX initiative, as the principles of openness and community involvement have been at the core of LINX for the last 20 years.  We are delighted to be the first IXP to apply for Open-IX certification,” comments John Souter, CEO of LINX.  "Our LINX NoVA exchange has been established not for commercial gain, but because our members and the community have indicated they want us to be there, in order to improve peering opportunities for networks in North America.  Improved peering ultimately brings benefits for North American businesses and citizens.  Open-IX embodies all the things that are for the good of the Internet and we would encourage network operators to co-locate in Open-IX certified data centers and to peer at Open-IX certified exchanges wherever possible."

OIX is a non-profit and neutral body of volunteers from the Internet community with the common goal of creating standards for Internet connectivity, resiliency, interconnect, security and cost.  OIX is seeking to help unify a highly fragmented industry and change the way networks connect with one another in North America by creating a new network of member-governed Internet Exchange Points (IXPs) housed in multiple neutral data center facilities that allow participants to interact and exchange content without the usual fiscal burden of commercial providers.

To apply for OIX certification in the initial, Northern Virginia, market, email info@open-ix.org.

About OIX
The Open-IX Association (OIX) is an Internet community-derived effort to improve the landscape of Internet peering and interconnect in the United States.  OIX encourages the development of neutral and distributed Internet Exchanges in North America while promoting uniform, cost-efficient standards of performance for interconnections backed by the Internet community.  The association aims to promote common and uniform specifications for data transfer and physical connectivity and improve IX performance by developing criteria and methods of measurement to reduce the complexity that restricts interconnection in fragmented markets.  The OIX Board is comprised of volunteer representatives from the Internet community in the United States, including Paul L. Andersen; Donald S. Clark; Dan Golding; Martin Hannigan; Keith Mitchell; David Temkin; and Barry Tishgart.  More information about OIX can be found by visiting www.open-ix.org.

About DuPont Fabros Technology, Inc.
DuPont Fabros Technology, Inc. (NYSE: DFT) is a leading owner, developer, operator and manager of enterprise-class, carrier-neutral, large multi-tenanted wholesale data centers.  The Company’s facilities are designed to offer highly specialized, efficient and safe computing environments in a low-cost operating model.  The Company’s customers outsource their mission critical applications and include national and international enterprises across numerous industries, such as technology, Internet content providers, media, communications, cloud-based, healthcare and financial services.  The Company's ten data centers are located in four major U.S. markets, which total 2.5 million gross square feet and 218 megawatts of available critical load to power the servers and computing equipment of its customers.  DuPont Fabros Technology, Inc., a real estate investment trust (REIT), is headquartered in Washington, DC.  For more information, please visit www.dft.com.

What issues and new technologies have disrupted the IT continuity landscape in 2013 and how are these likely to develop in 2014?

By Patrick Hubbard and Lawrence Garvin, SolarWinds.

We have spent the past year speaking with hundreds of techies at every major networking trade event in 2013 and from these discussions have drawn a number of predictions for the coming year, as well as insights into how the industry has evolved and developed over the past twelve months. Below, we share our thoughts on the past year and our predictions for 2014.

2013 was a year of vendor-led hype around buzz technologies such as SDN and cloud, but in practice very few notable advances in technologies or vendor offerings in these areas have come to fruition.

Cross-product support, and a noticeable increase in budget, have accelerated the advance of virtualization. Products such as Cisco Unified Computing System (UCS) have made it possible to integrate with VMware Vblock, boosting the desktop virtualization trend and widening its reach into mid-market networks. Similarly, with the launch of Hyper-V, 2013 was the year that Microsoft finally became a genuine player in the virtualization space.