Clients Will Receive Extra Incentives to Migrate to Robust Cloud Infrastructures
New York — Telx®, a leading provider of global interconnectivity, cloud enablement services and data center solutions, together with Peak® (formerly PeakColo), an enterprise-class IaaS cloud provider, today announced that two cloud nodes are now online and ready for customers, allowing direct access to the Telx Enabled Infrastructure-as-a-Service (IaaS) cloud powered by Peak. The new core nodes reside in two Telx data centers: SCL2, a 40,000-square-foot data center at 2820 Northwestern Parkway in Santa Clara, California, and ATL1, a 160,000-square-foot data center at 56 Marietta Street in Atlanta, Georgia.
To celebrate the opening of these new cloud nodes, the companies are offering special incentives to clients who establish new cloud service including hybrid configurations, which leverage both Telx colocation and a robust cloud platform powered by Peak. Clients looking to change providers can easily migrate to this secure cloud platform with a simplified plan that includes:
- Free data transfer to the Telx Enabled Cloud, Powered by Peak
- Free bandwidth (10 Gbps) until all customer data is seeded in Peak's cloud
- Free engineering assistance for migration planning
The plan is available until June 30, 2014.
“We wanted to provide easy ways for clients to utilize cloud computing and with our patented Layer 2 Peak to Peak Direct Connect network technology, both Telx customers and Peak partners can quickly migrate full production and backup environments directly to the cloud,” stated Dave Woodward, Senior Vice President of Sales for Peak. “The marketplace has positively responded to the enablement of Peak’s cloud within Telx facilities. We look forward to bringing more cloud nodes online in Telx facilities throughout the coming year.”
Peak currently has cloud nodes in 11 unique geographies across the U.S. and Europe offering partners, VARs and distributors flexible ways to connect interchangeably.
“Infrastructure as a Service continues to be a strong growth area for cloud computing because it provides businesses of all sizes with enhanced capabilities and direct financial benefits,” said Tim FitzGerald, vice president of Avnet Cloud Solutions, Avnet Technology Solutions, Americas. “The availability of Peak’s new cloud nodes through Telx provides Avnet’s channel partners in the U.S. and Canada with expanded opportunities to help their customers deploy secure cloud platforms. Our channel partners will be able to leverage these new nodes to address the IT infrastructure challenges of their customers related to business continuity, disaster recovery, storage and networking.”
“Telx enabled cloud services powered by Peak allows clients to access a customizable set of cloud services, managed storage and backup services, while leveraging Telx’s expansive access to leading carriers and global networks,” said Les Williams, Director, Strategic Business Development, Telx. “Partnering with Peak in our strategically located data centers in Santa Clara and Atlanta is just the beginning in our relationship. We are excited about the prospect of expanding our partnership with Peak across any of Telx’s 20 data centers coast to coast as the market needs arise. Telx is servicing some of the world’s largest network providers and enterprises across a wide range of business verticals and this partnership allows us to respond to increased customer demand for flexible cloud solutions.”
Peak® is an enterprise-class Infrastructure-as-a-Service (IaaS) cloud service provider to channel partners. By white-labeling Peak’s cloud services as their own, resellers and agents rapidly enter the cloud marketplace under their own brand without capital expenditure, enjoying a faster route to profitability. Peak operates Type II SSAE 16 and SOC 1 & 2 compliant cloud nodes in eight geographies across the United States and in Europe (Silicon Valley, Seattle, Denver, Chicago, New Jersey, New York, Atlanta, and the United Kingdom). Its VMware vCloud® Powered cloud environment contains tens of thousands of virtual machines and multiple petabytes of storage for public, private, hybrid and disaster recovery solutions. Peak offers both Cisco UCS and Open Compute platforms, and is a Platinum-level NetApp Service Provider. For more information, visit www.poweredbypeak.com, call (855) 532-4734, or follow us on Twitter or LinkedIn.
Telx is the leading domestic provider of colocation, data center support and services within 13 strategic North American markets. With an industry leading 100% Uptime & 100% “On-Time” Service Delivery SLAs, Telx clients build more agile businesses faster with reduced infrastructure complexity and broader reach to new markets. Its 1,200+ clients leverage a carrier neutral 24 Hour Cross Connect guarantee to interconnect to a vast base of high performance global networks.
Telx is a privately held company, servicing all industry types with dual headquarters in New York and San Francisco. Telx has a total of 20 strategic data centers which include six facilities across the New York / New Jersey Metro area, two facilities in Chicago, two facilities in Dallas, four facilities in California (Los Angeles, San Francisco, and two in Santa Clara), two Pacific Northwest facilities (Seattle and Portland), as well as facilities in Atlanta, Miami, Phoenix and Charlotte, N.C.
SAN FRANCISCO, MARCH 14, 2014 – IObit, the expert in system utility and optimization, today announced that its popular driver-updating tool, Driver Booster, will continue to support Windows XP after Microsoft ends support for the operating system in April 2014. The recently released Driver Booster v1.3 has been improved to help Windows XP users find the right, up-to-date drivers to enhance system stability, performance and security.
After April, it could be frustrating for Windows XP users to find and install the right drivers for their aging PCs. Without up-to-date drivers, a system cannot run at peak performance and may suffer slowdowns, freezes or even crashes. Vulnerabilities and security holes left by obsolete drivers can also cause unexpected problems.
With its large database and cloud library, Driver Booster can quickly detect outdated drivers, then find and easily install the appropriate replacements for a given PC. What makes Driver Booster different is that it is specially designed for gamers, supplying the appropriate drivers for a wide range of graphics cards to enhance the gaming experience. Compared with manual updating, Driver Booster’s process is greatly simplified by its one-click design. IObit plans to expand the database continuously and promptly to address problems caused by the end of Windows XP support.
“No matter which Windows operating system users are running, regularly checking drivers is an essential part of PC maintenance,” said Antonio Zhang, Marketing Director at IObit. “Our survey found that about 60% of Windows XP users want to keep using Windows XP. We want to let them know what will happen and to get through this tough period together with them. If needed, our support will be permanent.”
Driver Booster v1.3 is now available on IObit.com and Download.com. For Windows XP users, a Driver Booster giveaway will be held on the IObit Facebook fan page on March 20. Besides Driver Booster, five other IObit products will continue to support Windows XP. For details, follow IObit.com and the IObit Facebook page.
About Driver Booster
Driver Booster scans and identifies outdated drivers automatically, and downloads and installs the right update with just ONE click. It's the right tool to protect PC from hardware failures, conflicts, and system crashes. To download the program, please visit: http://www.iobit.com/driver-booster.php
Founded in 2004, IObit provides consumers with innovative system utilities for Windows, Mac, and Android to greatly enhance device performance and protect users from security threats. IObit is a well-recognized industry leader with more than 100 awards, 200 million downloads and 10 million active users worldwide.
The recent flooding episode has highlighted shortcomings in the UK government’s approach to risk events, says Richard Anderson, Chairman of the Institute of Risk Management.
“The terrible flooding in Somerset and the Thames has brought into sharp focus the ‘fingers crossed’ and ‘touching wood’ approach to risk management strategy that is so often adopted by government. It is regrettable that this seems to be the default mechanism to approaching all manner of risks. It is an appalling state of affairs because we understand how to manage risk better now than we ever have in the past. Since the flooding we have seen lots of frenetic activity from government officials which is unproductive and the government would be better served by seeking the advice of the increasing cadre of expert risk professionals who are largely being ignored at the moment.
“Routine risk thinking tends to be handled at a very junior level in government. Much of it is no more than painting by numbers as committees consider whether a risk should be red, amber or green. Most risks are considered in isolation from other risks materialising. That is not what happens in real life: in real life, as one thing hits, another does straight after, and another and another. The interdependence of multiple impact risks needs to be managed far more professionally.”
The US National Institute of Standards and Technology (NIST) will host the first of six workshops devoted to developing a comprehensive, community-based disaster resilience framework, a national initiative carried out under the President's Climate Action Plan. The workshop will be held at the NIST laboratories in Gaithersburg, Md., on Monday, April 7, 2014.
Focusing on buildings and critical infrastructure, the planned framework will aid communities in efforts to protect people and property and to recover more rapidly from natural and man-made disasters. Hurricanes Katrina and Sandy, and other recent disasters, have highlighted the interconnected nature of buildings and infrastructure systems and their vulnerabilities.
The six workshops will focus on the roles that buildings and infrastructure systems play in ensuring community resilience. NIST will use workshop inputs as it drafts the disaster resilience framework. To be released for public comment in April 2015, the framework will establish overall performance goals; assess existing standards, codes, and practices; and identify gaps that must be addressed to bolster community resilience.
NIST seeks input from a broad array of stakeholders, including planners, designers, facility owners and users, government officials, utility owners, regulators, standards and model code developers, insurers, trade and professional associations, disaster response and recovery groups, and researchers.
All workshops will focus on resilience needs, which, in part, will reflect hazard risks common to geographic regions.
The NIST-hosted event will begin at 8 a.m. and is open to all interested parties. The registration fee for the inaugural workshop is $55. Space is limited. To learn more and to register, go to: www.nist.gov/el/building_materials/resilience/disreswksp.cfm.
Registration closes on March 31, 2014.
More information on the disaster resilience framework can be found at www.nist.gov/el/building_materials/resilience/framework.cfm
The UN Office for Disaster Risk Reduction (UNISDR) is working with IBM and AECOM to measure cities’ resilience to disasters.
The first output of the partnership is a Disaster Resilience Scorecard created for use by members of UNISDR’s ‘Making Cities Resilient’ campaign which has been running now for almost four years.
The scorecard is based on the Campaign’s Ten Essentials – UNISDR’s list of top priorities for building urban resilience to disasters — and has been developed by IBM and AECOM. A list of potential cities is being developed to test the scorecard and to support their disaster resilience planning.
The Disaster Resilience Scorecard reviews policy and planning, engineering, informational, organizational, financial, social and environmental aspects of disaster resilience. Each of the criteria has a measurement scale of 0 to 5, whereby 5 is regarded as ‘good practice.’
The scorecard will be available at no cost through UNISDR, AECOM or IBM.
Both IBM and AECOM are part of UNISDR’s Private Sector Advisory Group and the Making Cities Resilient Steering Committee.
CSO — Healthcare organizations see an expanding landscape of uncertainty that has raised concerns among security pros and points to the need for more thorough threat analyses, a study showed.
Risks posed by health insurance and information exchanges, employee negligence, cloud services and mobile device usage have dampened confidence in protecting patient data, the Fourth Annual Benchmark Study on Patient Privacy & Data Security found. The study, released Wednesday, was conducted by the Ponemon Institute and sponsored by data breach prevention company ID Experts.
Despite the concerns, the study showed progress on the security front. The average cost of data breaches for organizations represented in the study fell to $2 million over a two-year period, compared to $2.4 million in last year's report.
Data deduplication, the elimination of repeated data to save storage space and speed transmission over the network, sounds good, right? ‘Data deduping’ is currently in the spotlight as a technique to help organisations boost efficiency and save money, although it’s not new. PC utilities like WinZip have been compressing files for some time. The new angle is doing this systematically across vast swathes of data. By reducing the storage volume required, enterprises may be able to keep more data on disk or even in flash memory, rather than in tape archives. Vendor estimates suggest customers might store up to 30 terabytes of digital data in a physical space of just one terabyte.
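Deduplication is typically implemented by splitting data into blocks, hashing each block, and storing each unique block only once; a file is then just a list of block references. Here is a minimal Python sketch of that idea. The fixed-size blocks and SHA-256 fingerprints are illustrative choices, not a description of any particular vendor's product:

```python
import hashlib

def dedupe(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks and store each unique block once.

    Returns (store, recipe): store maps fingerprint -> block bytes,
    recipe is the ordered list of fingerprints needed to rebuild the data.
    """
    store = {}
    recipe = []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # duplicate blocks are stored only once
        recipe.append(digest)
    return store, recipe

def rehydrate(store, recipe):
    """Reassemble the original bytes from the block store and recipe."""
    return b"".join(store[h] for h in recipe)

# Highly repetitive data dedupes well: 100 identical blocks collapse to one.
data = b"A" * 4096 * 100
store, recipe = dedupe(data)
assert rehydrate(store, recipe) == data
assert len(store) == 1  # one unique block stored instead of 100
```

Production systems refine this sketch considerably, for example with variable-size (content-defined) chunking so that inserting a byte doesn't shift every subsequent block boundary, and with persistent, collision-safe block stores.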
In my previous post, I shared the ongoing debate about the most effective way to approach Big Data so that it will yield meaningful, useful and, hopefully, profitable findings.
The two leading options are approaching the data as an explorer versus Tom Davenport’s contention that you need to start with a hypothesis, which I translate as a more scientific-method-based approach.
Explorer advocates say Big Data is too big for the typical reports-driven approach, and what’s worked for early adopters has been tinkering with the data to see what it reveals. Davenport and others contend that is a great way to waste time, spend money and create unhappy business leaders.
According to a recent Entrepreneur article, small businesses should find effective ways to analyze data in order to give their customers what they need without pushing too hard to gather more data from those same customers. Sounds simple, yet complex.
And when you also consider that data is increasing exponentially, and the way Big Data has been multiplying, it’s no wonder small to midsize businesses (SMBs) have become quite overwhelmed about how to collect, sort, and use Big Data in any effective manner.
But what SMBs need to realize is that the key to using data is “refinement.” In his Entrepreneur article, Suhail Doshi explains:
Archiving has always been one of those functions that pulls the enterprise in two different directions. Increased data volumes, of course, require more storage capacity, but as data sits in the archives for longer periods of time, it loses its value. So in the end, the enterprise must devote more resources to constantly diminishing assets.
Of course, this is the lifeblood of the archival management industry as numerous companies work up sophisticated algorithms and other tools to analyze data and then shift it from one set of resources to another based on its intrinsic value. The real purpose behind Big Data management, after all, is not to accommodate increasing volumes but to mine existing stores for gold and then store the rest at the lowest possible cost—or discard it altogether.
Naturally, part of this process requires the development of low-cost media, such as tape, which offers the benefit of stable, long-term storage for data that is accessed infrequently. Disk-based archiving is also gaining in popularity, although this is primarily in tiered solutions, considering the disk’s relatively weak long-term reliability.
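The tiering policies described above can be reduced to a simple rule: place each object on the cheapest tier its access pattern still justifies. The sketch below illustrates one such age-based policy in Python; the tier names and recency cutoffs are hypothetical, not drawn from any specific product:

```python
from datetime import datetime, timedelta

# Hypothetical tiers, hottest first. Cutoffs are illustrative assumptions:
# objects touched recently stay on fast media, cold data ages out to tape.
TIERS = [
    ("flash", timedelta(days=30)),   # hot: accessed within the last 30 days
    ("disk",  timedelta(days=365)),  # warm: accessed within the last year
    ("tape",  None),                 # cold: everything older
]

def assign_tier(last_access: datetime, now: datetime) -> str:
    """Return the hottest tier whose recency cutoff the object still satisfies."""
    age = now - last_access
    for tier, cutoff in TIERS:
        if cutoff is None or age <= cutoff:
            return tier
    return TIERS[-1][0]

now = datetime(2014, 3, 1)
assert assign_tier(datetime(2014, 2, 20), now) == "flash"  # 9 days old
assert assign_tier(datetime(2013, 9, 1), now) == "disk"    # ~6 months old
assert assign_tier(datetime(2010, 1, 1), now) == "tape"    # years old
```

Real archival managers weigh far more than age (access frequency, retention rules, business value), but the core mechanic is the same: periodically re-evaluate each object against the policy and migrate it when its assigned tier changes.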