
Jon Seals


Global voice cloud provider to offer first-ever self-service telephony within Salesforce
LONDON - Natterbox, the UK-based global voice cloud services provider, today introduced Natterbox VoiceCloud PBX for Salesforce®, the first complete global telephony system managed from within Salesforce. The service introduces a new era in self-service telephony – landline and mobile – for the world’s leading enterprise cloud ecosystem.
With this introduction, Natterbox pioneers the embedding of cloud PBX management into Salesforce. Salesforce administrators and managers can now deploy and manage their global telephony system, including landline and mobile, without relying on IT or telecoms staff.
Natterbox CEO Neil Hammerton said, “The migration of business telephony to a cloud model is well accepted. This is driving a new breed of telephony that is agile, flexible and helps organizations innovate to deliver a better customer experience and staff productivity.” He added, “The next stage of cloud telephony is self-service. Amazon Web Services (AWS) showed the world how cloud technology self-service drives disruptive innovation. We are bringing this approach to Salesforce telephony over our global network.”
Natterbox VoiceCloud for Salesforce brings telephony management directly to sales, service, marketing and other functions. For example, a customer service manager who wants to handle inbound calls from out-of-service customers can route those calls to a highly skilled resolution team. A personalized IVR response then delivers a better customer experience and signals to those customers that they matter. Innovation like this can be activated within minutes rather than waiting for expensive telecoms staff or external resources to implement it.
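The routing rule described above can be sketched in a few lines. This is purely illustrative: the function and queue names are hypothetical and do not represent the Natterbox API, which the press release does not detail.

```python
# Hypothetical sketch of the routing scenario described above: inbound
# calls from customers flagged "out of service" in the CRM go to a
# dedicated resolution queue; all other callers follow the default path.
# Names (route_call, queue labels) are illustrative, not Natterbox APIs.

def route_call(caller_status: str) -> str:
    """Return the queue an inbound call should be placed in."""
    if caller_status == "out_of_service":
        return "resolution_team"  # highly skilled resolution queue
    return "general_support"      # default inbound queue
```

The point of self-service telephony is that an administrator could change a rule like this from within Salesforce, without a telecoms change request.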
As well as management within Salesforce, Natterbox delivers all the service components required for a global cloud telephony system with Salesforce integration. Historically this has been a difficult and complex process requiring multiple suppliers and creating significant management, compatibility and support costs. According to Hammerton, “Natterbox is both a full global telephony provider and Salesforce integration supplier. We are a single source and integrator of all domestic or global telephony components, which we deliver as a cloud service – from the PBX, numbers, lines and phones through to complete Salesforce integration, which even extends to enterprise call recording from mobile or landline phones.”
Natterbox VoiceCloud PBX for Salesforce is available now in beta, with full release planned for Q3 2016.
Natterbox CEO Neil Hammerton is available for introductory media discussions via video call. He spoke earlier this week at Salesforce1 in London, discussing The Future of Salesforce Telephony.
About Natterbox
Natterbox launched in 2010 to solve business telephony issues and bring voice into the digitized customer experience through a global cloud PBX service that captures and integrates voice into customer processes and Salesforce® systems. Over 500 organizations around the world rely on Natterbox to set new standards in customer experience and drive measurable increases in sales efficiency, competitive advantage and organizational success. Customers include Groupon, Kimberly-Clark, Rakuten and Legal & General.
Eliminate Media Web Servers, Media Access Applications, Relational Databases and File Systems by Integrating a Single Tier of Scalable Storage

AUSTIN, Texas – Content distribution sites derive value from how quickly and easily they can deliver digital assets to their audiences. To ensure the best end-user experience, many content providers have implemented a secondary set of media access infrastructures to support their rapidly scaling sites, including additional media web servers, media access applications, relational databases and distributed file systems to deliver, access and store content. A better solution than adding multiple tiers of infrastructure to purchase, deploy, scale and manage is to consolidate on a single, scalable tier of web-accessible storage, say experts at Caringo.

As content libraries grow, traditional technologies like relational databases and file systems become difficult to manage and protect. The cost of hardware and staff required to manage these disparate systems ultimately limits operational flexibility, while each added layer of infrastructure compounds latency – resulting in increased buffering, slow content delivery and, ultimately, lost viewers. The solution is to consolidate the media web server, media access application and relational database tiers into a single tier of searchable storage built for the cloud age, enabled by Caringo Swarm.
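What "a single tier of web-accessible storage" means in practice is that clients fetch media objects directly over HTTP, so no separate media web server, access application or database lookup sits in the delivery path. The sketch below only illustrates that idea; the host and bucket names are hypothetical and not a real Caringo endpoint.

```python
# Minimal sketch: a player or CDN edge resolves a media asset to a
# direct HTTP URL on the storage tier and issues a plain GET against it,
# with no intermediate media server or database lookup in between.
from urllib.parse import urljoin

def object_url(storage_host: str, bucket: str, object_name: str) -> str:
    """Build the direct HTTP URL for a stored media asset."""
    return urljoin(f"http://{storage_host}/", f"{bucket}/{object_name}")

url = object_url("storage.example.com", "media", "episode-042.mp4")
# A client would then fetch this URL directly, e.g. with an HTTP GET.
```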

“Content distributors looking to reduce latency by 40%, reduce storage costs by 75% and radically simplify content delivery, access from applications and content management should look no further than Caringo Swarm,” said Adrian Herrera, Caringo Vice President of Marketing. “This is the reason Caringo is used as the back end for major media properties owned by NEP, IAC and various cultural media archives worldwide.”

Offered as a complete software appliance, Swarm provides a storage platform for data protection, management, organization and search at massive scale. Users no longer need to migrate data into disparate solutions for long-term preservation, delivery and analysis. Organizations can easily consolidate all files on Swarm, find the data they are looking for quickly, and reduce total cost of ownership by continuously evolving hardware and optimizing use of their resources.

For more information on how Caringo Swarm simplifies and streamlines content delivery, visit https://caringo.wistia.com/medias/nryrwg3p10.

Follow Caringo

LinkedIn: https://www.linkedin.com/company/caringo-inc-

Twitter: https://twitter.com/CaringoStorage

About Caringo

Caringo was founded in 2005 to change the economics of storage by designing software from the ground up to solve the issues associated with data protection, management, organization and search at massive scale. Caringo’s flagship product, Swarm, eliminates the need to migrate data into disparate solutions for long-term preservation, delivery and analysis—radically reducing total cost of ownership. Today, Caringo software is the foundation for simple, bulletproof, limitless storage solutions for the Department of Defense, the Brazilian Federal Court System, City of Austin, Telefónica, British Telecom, Ask.com, Johns Hopkins University and hundreds more worldwide. Visit www.caringo.com to learn more.

We’re used to hearing that security is the biggest bugaboo holding back greater migration to the cloud. Internet security concerns are said to be so acute that the claim is widely accepted as axiomatic.

But it’s time to revise that argument.

Digital security still rates as an important issue in any discussion about whether to migrate an enterprise’s data to the cloud. But enterprises have warmed up to cloud computing to the point where their biggest challenge now is finding enough people with the technical backgrounds needed to keep their cloud systems up and running.



Thursday, 19 May 2016 00:00

Joplin Study Spawns Code Recommendations

Though building codes for schools and a range of other structures provide for protection against winds of up to 115 mph, that’s not nearly enough to protect against a strong tornado such as an EF4, an EF5 or even an EF3. In fact, building codes don’t even mention tornadoes except when discussing a safe room or shelter.
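To see how far short that 115 mph design speed falls, it helps to set it against the Enhanced Fujita scale's estimated 3-second-gust ranges (in mph). The comparison below is a rough illustration, not part of the NIST study itself.

```python
# Enhanced Fujita scale, estimated 3-second gust ranges in mph
# (EF5 has no defined upper bound).
ef_scale_mph = {
    "EF0": (65, 85), "EF1": (86, 110), "EF2": (111, 135),
    "EF3": (136, 165), "EF4": (166, 200), "EF5": (200, None),
}

design_speed = 115  # typical code-mandated design wind speed, mph

# EF ratings whose lower bound the design speed at least reaches:
covered = [ef for ef, (low, _) in ef_scale_mph.items() if low <= design_speed]
# 115 mph does not even reach the lower bound of an EF3 tornado.
```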

That has to change: building codes and standards need to acknowledge tornadoes and the difference between straight-line wind speeds and the variable winds a tornado presents. That is one of the 16 recommendations that resulted from a National Institute of Standards and Technology (NIST) study of the May 2011 tornado that killed 161 people and damaged more than 7,500 structures in Joplin, Mo.

The tornado was the deadliest since record-keeping began in 1951, which prompted the study to determine what factors contributed most to the death and destruction. The NIST team, led by Marc Levitan, examined four key contributing factors: storm characteristics; building performance; human behavior; and emergency communication.



Thursday, 19 May 2016 00:00

Global Warming of Data

Eric Bassier is Senior Director of Datacenter Solutions at Quantum.

It already reached 90 degrees in Seattle this year. In April. I’m not complaining – yet – but I’m definitely a believer that global warming is happening and that we need to make some changes to address it. But this article isn’t about climate change – it’s about data. Specifically, it’s about the growth of unstructured data and the gloomy fate ahead if we continue to deny the problem and ignore the warning signs. Sound familiar?

It’s hard to argue with the evidence of unstructured data growth. Estimates and studies vary, but the consensus is that there will be 40-50 zettabytes of data by the year 2020, and 80-90 percent of it will be unstructured.
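A quick back-of-the-envelope calculation shows what those two ranges imply when combined:

```python
# Combine the cited ranges: 40-50 ZB of total data by 2020,
# of which 80-90 percent is unstructured.
total_zb_low, total_zb_high = 40, 50
frac_low, frac_high = 0.80, 0.90

unstructured_low = total_zb_low * frac_low    # lower bound, in ZB
unstructured_high = total_zb_high * frac_high # upper bound, in ZB
```

That works out to roughly 32 to 45 zettabytes of unstructured data, i.e. the unstructured portion alone would dwarf all data that exists today.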