WANdisco Fusion Now Available With Bridgeworks PORTrockIT for Best-of-Breed Cloud Migration and Hybrid Cloud Deployments
SAN RAMON, Calif. – WANdisco (LSE: WAND), the leading provider of software that enables global enterprises to meet today's data challenges of secure storage, scalability and availability, today announced a new partnership with WAN acceleration vendor Bridgeworks, recognized as a Gartner Cool Vendor in Enterprise Networking for 2016.
Bridgeworks PORTrockIT software will now be available with WANdisco Fusion's patented active-transactional replication technology, which was built to move transactional data to the cloud at petabyte scale without downtime or data loss. PORTrockIT's patented technology supports businesses that rely on fast movement of large volumes of data, increasing transfer performance by up to 100 times. This added speed delivers enterprise data resilience and security for critical applications such as backup, replication, disaster recovery and cloud migration.
"This new partnership brings together two best-of-breed solutions to unlock the full potential of the cloud," said David Richards, WANdisco CEO and Co-Founder. "WANdisco's continuous availability and guaranteed data consistency can now be deployed with software from Bridgeworks that virtually removes the effects of network latency, making it possible for on-premises and cloud environments to operate as one."
The combined Bridgeworks and WANdisco technology will ensure that large volumes of live production data can move to and from the cloud without any business disruption. This further supports hybrid cloud requirements for on-demand, burst-out processing, and offsite disaster recovery, without downtime or data loss. The combined solution eliminates the need for cloud vendor storage appliances to be sent back and forth from customer data centers in a process that involves days of downtime and is only suitable for one-time movement of cold, less critical data.
"Enterprises increasingly need to move large volumes of data to where it needs to be, at speed. They also need full trust and confidence that their infrastructure can deliver the performance that is critical to their business," said David Trossell, CEO of Bridgeworks.
"WANdisco software is already known for its performance and it has revolutionized the global enterprise requirement to migrate large amounts of transactional data to the Cloud. When combined with Bridgeworks PORTrockIT technology, the resulting increase of up to 100 times in transfer speed has raised the bar for business.
"The combination of WANdisco and Bridgeworks will ensure a flexible and scalable solution for the increasing number of Enterprises looking to the Cloud as a viable option for running business-critical applications that rely on speed of data migration."
WANdisco (LSE: WAND) is a provider of enterprise-ready, non-stop software solutions that enable globally distributed organizations to meet today's data challenges of secure storage, scalability and availability. WANdisco's products are differentiated by the company's patented, active-transactional data replication technology, serving crucial high availability (HA) requirements for Hadoop Big Data and Application Lifecycle Management (ALM), including Apache Subversion and Git. Fortune Global 1000 companies, including Juniper Networks, Motorola, and Halliburton, rely on WANdisco for performance, reliability, security and availability. For additional information, please visit http://www.wandisco.com.
Bridgeworks enables businesses to work smarter, solving the universal problem of data movement over distance for backup, replication, disaster recovery, migration, or indeed any vital data movement application in a hybrid world. Additional information about Bridgeworks can be found at www.4bridgeworks.com.
Apache Hadoop and Subversion are trademarks of the Apache Software Foundation (ASF). All other product and company names herein may be trademarks of their respective owners.
The Solution, Now Generally Available, Streamlines Complex CI Deployments for VARs and Reduces Configuration Time From 70+ Hours to 60 Minutes
ATLANTA, Ga. – StrataCloud, a software-defined infrastructure solutions provider, today announces the general availability of StrataCloud SDI Install, a software application that automates the configuration of converged infrastructure (CI) systems. The first software to be released on the StrataCloud SDI platform, SDI Install is available today for value-added resellers (VARs) and systems integrators that wish to speed deployment and reduce the complexity of installing CI systems. SDI Install initially supports the popular FlexPod product lines from Cisco and NetApp, with future plans to support other leading CI systems.
Converged infrastructure solutions combine the core data center components of servers, storage and networking to simplify IT infrastructure expansion and management. Many companies moving to a software-defined infrastructure are adopting converged infrastructure, given its promise for higher levels of efficiency, performance and flexibility in delivering IT services to the business. Yet the VARs tasked with installing CI systems encounter challenges during deployment; configuring the systems requires specialized IT architecture skills, as well as individual expertise in each of the hardware components. The time-consuming process requires the use of multiple console interfaces and expensive personnel.
StrataCloud SDI Install gives the VAR a single interface for FlexPod design and deployment, with a decision tree that guides the architect through the design process and then creates a blueprint to guide installation. Once a blueprint is created, it can be accessed and reused to launch unlimited future installations, providing a consistent, repeatable process. At the customer site, the installer accesses the blueprint to launch automatic configuration and validation of the hardware in the customer's environment. By automating complex manual configuration processes, SDI Install enables a single individual to deploy a CI system from start to finish. In testing, SDI Install has been shown to speed the configuration of FlexPod from 70+ hours to 60 minutes.
"SDI Install is a game changer for converged infrastructure, as it significantly reduces the number of manual configurations that are required to get a FlexPod up and running," says Brian Cohen, CEO of StrataCloud. "Our solution provides a streamlined out-of-the-box experience for deploying FlexPod that is fast, easy and consistent. With SDI Install, VARs spend less time installing and more time addressing customer business outcomes."
About the StrataCloud SDI Platform
SDI Install is the first product built on StrataCloud SDI, the platform for software-defined infrastructure. By producing a logical model of infrastructure, StrataCloud SDI creates order from chaos in the data center. Through the use of an extensible object graph, StrataCloud SDI can normalize any infrastructure hardware, understand how infrastructure systems are connected, abstract the functionality of disparate APIs and protocols, and read and write to infrastructure in a consistent manner from a single interface. The flexible, scalable platform is built from the ground up on a unified architecture that can support vendor-agnostic infrastructure configuration, as well as enterprise-grade support throughout the infrastructure lifecycle.
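The object-graph idea described above can be sketched in miniature. The sketch below is purely illustrative, not StrataCloud's actual API: the class names, the `apply_config` method, and the `deploy` helper are all assumptions. It shows the general pattern the platform describes: each vendor device is normalized into one node type, edges record how systems are connected, and a single interface writes the same blueprint to heterogeneous hardware.

```python
from abc import ABC, abstractmethod

class Device(ABC):
    """Normalized node in a hypothetical infrastructure object graph."""
    def __init__(self, name):
        self.name = name
        self.links = []  # edges to connected devices

    def connect(self, other):
        # Record the connection in both directions (an undirected edge).
        self.links.append(other)
        other.links.append(self)

    @abstractmethod
    def apply_config(self, settings):
        """Each subclass hides its own vendor protocol behind this call."""

class CiscoSwitch(Device):
    def apply_config(self, settings):
        # Stand-in for a vendor-specific CLI/SSH session.
        return f"CLI: configured {self.name} with {settings}"

class NetAppArray(Device):
    def apply_config(self, settings):
        # Stand-in for a vendor-specific REST API call.
        return f"REST: configured {self.name} with {settings}"

def deploy(blueprint, devices):
    """Push one reusable blueprint to every node through the same interface."""
    return [d.apply_config(blueprint) for d in devices]

switch = CiscoSwitch("sw-01")
array = NetAppArray("na-01")
switch.connect(array)
print(deploy({"vlan": 100}, [switch, array]))
```

The point of the pattern is the last line: the caller never touches a vendor console, and the same blueprint can be replayed against any graph of normalized devices.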
Headquartered in Atlanta, StrataCloud is a leading provider of software-defined infrastructure solutions. The StrataCloud SDI platform will provide a toolkit that helps IT teams streamline infrastructure operations and deliver public cloud agility from a private data center. Learn more at www.stratacloud.com.
Eliminate Media Web Servers, Media Access Applications, Relational Databases and File Systems by Integrating a Single Tier of Scalable Storage

AUSTIN, Texas – Content distribution sites derive value from how quickly and easily they are able to deliver digital assets to their audiences. To ensure the best end-user experience, many content providers have implemented a secondary set of media access infrastructures to support their rapidly scaling sites, including additional media web servers, media access applications, relational databases and distributed file systems to deliver, access and store content. A better solution than adding multiple tiers of infrastructure to purchase, deploy, scale and manage is to consolidate on a single, scalable tier of web-accessible storage, say experts at Caringo.

As content libraries grow, traditional technologies like relational databases and file systems become difficult to manage and protect. The cost of hardware and staff required to manage these disparate systems ultimately limits operational flexibility, while the addition of each layer of infrastructure results in compounding latency—resulting in increased buffering, slow content delivery and, ultimately, lost viewers. The solution is consolidating the media web server, media access application and relational database tier with searchable storage for the cloud age, enabled by Caringo Swarm.

“Content distributors looking to reduce latency by 40%, reduce storage costs by 75% and radically simplify content delivery, access from applications and content management should look no further than Caringo Swarm,” said Adrian Herrera, Caringo Vice President of Marketing. “This is the reason Caringo is used as the back end for major media properties owned by NEP, IAC and various cultural media archives worldwide.”

Offered as a complete software appliance, Swarm provides a storage platform for data protection, management, organization and search at massive scale.
Users no longer need to migrate data into disparate solutions for long-term preservation, delivery and analysis. Organizations can easily consolidate all files on Swarm, find the data they are looking for quickly, and reduce total cost of ownership by continuously evolving hardware and optimizing use of their resources.

For more information on how Caringo Swarm simplifies and streamlines content delivery, visit https://caringo.wistia.com/medias/nryrwg3p10.

Follow Caringo
LinkedIn: https://www.linkedin.com/company/caringo-inc-
Twitter: https://twitter.com/CaringoStorage

About Caringo
Caringo was founded in 2005 to change the economics of storage by designing software from the ground up to solve the issues associated with data protection, management, organization and search at massive scale. Caringo’s flagship product, Swarm, eliminates the need to migrate data into disparate solutions for long-term preservation, delivery and analysis—radically reducing total cost of ownership. Today, Caringo software is the foundation for simple, bulletproof, limitless storage solutions for the Department of Defense, the Brazilian Federal Court System, City of Austin, Telefónica, British Telecom, Ask.com, Johns Hopkins University and hundreds more worldwide. Visit www.caringo.com to learn more.
We’re used to hearing that security is the biggest bugaboo holding back greater migration to the cloud. Internet security concerns are said to be so acute that the claim is widely accepted as an axiomatic truth.
But it’s time to revise that argument.
Digital security still rates as an important issue in any discussion about whether to migrate an enterprise’s data to the cloud. But enterprises have warmed up to cloud computing to the point where their biggest challenge now is actually finding enough people who have the necessary technical backgrounds to keep their cloud systems up and running.
Though building codes for schools and a range of other structures provide for protection against winds of up to 115 mph, that’s not nearly enough to protect against a strong tornado such as an EF4, an EF5 or even an EF3. In fact, building codes don’t even mention tornadoes except when discussing a safe room or shelter.
That has to change: building codes and standards need to acknowledge tornadoes and the difference between straight-line wind speeds and the variable wind forces that tornadoes present. That is one of 16 recommendations resulting from a National Institute of Standards and Technology (NIST) study of the May 2011 tornado that killed 161 people and damaged more than 7,500 structures in Joplin, Mo.
The tornado was the deadliest since records were first kept in 1951, which prompted the study to determine what factors contributed most to the death and destruction. The NIST team, led by Marc Levitan, examined four key factors: storm characteristics; building performance; human behavior; and emergency communication.
Eric Bassier is Senior Director of Datacenter Solutions at Quantum.
It already reached 90 degrees in Seattle this year. In April. I’m not complaining – yet – but I’m definitely a believer that global warming is happening and that we need to make some changes to address it. But this article isn’t about climate change – it’s about data. Specifically, it’s about the growth of unstructured data and the gloomy fate ahead if we continue to deny the problem and ignore the warning signs. Sound familiar?
It’s hard to argue with the evidence of unstructured data growth. Estimates and studies vary, but the general consensus is that there will be 40-50 zettabytes of data by the year 2020, and 80-90 percent of that will be unstructured.
If all things were equal between the private and public cloud, few enterprises would migrate their workloads to public infrastructure. All things are not equal, however, so IT executives are constantly weighing the security and availability concerns of the public cloud against the higher capital costs and limited scale of the private side.
But while public providers have made a lot of noise touting their improved encryption and service reliability, an equally strong movement is brewing to make private cloud infrastructure more scalable, easier to deploy and less expensive.
The private cloud requires private infrastructure, of course, so deploying resources at scale remains a key challenge. (Yes, hosted is an option, too, but I’m talking about true in-house private clouds.) This is why emerging platform providers like Tintri are pushing the envelope when it comes to deploying hefty resource architectures without crushing the budget. The company’s new VMstore T5000 All-Flash Series appliance supports upwards of 160,000 virtual machines and can be outfitted with SaaS-based predictive analytics and other tools to enable advanced capacity and performance models to suit Apache Spark, ElasticSearch and other Big Data engines. And as is the company’s modus operandi, the system scales at the VM level rather than the LUN level to enable greater flexibility when matching resources to workloads.
(TNS) - With one expert calling Zika “the virus from hell,” health officials warned state lawmakers about the spread of the Zika virus across the state and offered their insights on possible response measures in case of an outbreak.
John Hellerstedt, commissioner for the Department of State Health Services, warned that the virus is expected to begin spreading as prime mosquito season nears.
“We don’t know when and we don’t know at what level that will occur,” Hellerstedt said.
In response to the growing number of Zika cases in Texas in recent months, lawmakers met Tuesday afternoon to discuss what is being done in the state to prevent an outbreak of the virus.
Considering the number of threats that organizations face today, it may be surprising to learn that the majority of companies are not prepared for a business-affecting emergency. Unfortunately, it’s true: The Disaster Recovery Preparedness Council found that nearly three quarters of organizations worldwide aren’t properly protecting their data and systems.
The potential consequences of not having a business continuity management program are extremely grave. Consider the many risks that your company faces: network outages, natural disasters, active shooter events, data breaches and more. However, if your organization doesn’t take business continuity seriously, you’re facing even greater risks, including the following: