According to the latest predictions from analyst firm IDC, “more than 80% of enterprise IT organizations will commit to hybrid cloud by 2017.” That means your organization is likely to evaluate Infrastructure-as-a-Service (IaaS) solutions this year, if you haven’t chosen one already. As you consider options, it can be difficult to evaluate the different management platforms and sort through the vendor claims. To make it easier, a team of technical experts developed a list of evaluation criteria and recently published a white paper that provides a clear comparison between Cisco UCS Director and HPE OneView. The paper looks at three critical areas of IaaS functionality:
- Orchestration and automation
- Self-service provisioning
- Heterogeneous provisioning and management
A concise side-by-side comparison appears in a table on page 5 of the document, with details covered in the paper’s other sections.
QTS Realty Trust has been one of the fastest-growing publicly traded data center REITs since its 2013 IPO, and its shares have delivered more than 80 percent price appreciation to shareholders over the last two years.
Can the company maintain this momentum going into 2016? That’s the question we asked its CIO Jeff Berson and COO Dan Bennewitz in a recent interview.
Last week, JP Morgan selected QTS as one of two data center REITs with an Overweight rating, along with sector peer CyrusOne.
When I told you in my previous email that the only way to successfully erase a file is to COMPLETELY overwrite it, I wasn’t just trying to be dramatic. A few months ago, my friend had mistakenly deleted some photos from her SD card, so I encouraged her to try out some data recovery software. She was very surprised to find not only the pictures that she’d deleted, but also some very old ones — including her parents’ holiday pictures from when they used the SD card with their own camera.
I mentioned before that when a file is deleted, the physical slot in which it is stored becomes free, and new data can be saved there. So it might be tempting to leave things to run their course and wait for the file to be overwritten by another. Don’t give in to that temptation — waiting is not enough. Here’s why:
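To make “completely overwrite” concrete, here is a minimal Python sketch of the idea: replace the file’s contents with random bytes before deleting it, so recovery tools find only noise. The function name and single-pass default are my own choices, and note one caveat: on SSDs and journaling or copy-on-write filesystems, an in-place write is not guaranteed to land on the original physical blocks, which is why dedicated wiping tools exist.

```python
import os

def overwrite_and_delete(path, passes=1):
    """Overwrite a file's contents with random bytes, then delete it.

    Caveat: on SSDs and journaling/copy-on-write filesystems, in-place
    writes may not hit the original physical blocks.
    """
    length = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(length))  # replace every byte with random data
            f.flush()
            os.fsync(f.fileno())         # push the write to the device
    os.remove(path)                      # only now remove the directory entry
```

Simply calling `os.remove` alone would skip the overwrite step entirely, which is exactly the situation that lets recovery software resurrect “deleted” photos.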
If you use a cloud service or let your employees access company systems from their own smartphones, you’ve probably already noticed how your IT security world has expanded. What used to be a tightly defined domain behind a firewall has morphed into something that now extends to the far confines of cyberspace. As a matter of principle, any business data that travels outside the company perimeter is automatically at greater risk, even if enterprises make great efforts to keep the risk delta as small as possible. However, the macro-style solution of a bigger firewall no longer works when you have to deal with the Internet at large. Micro-oriented strategies offer an alternative.
In essence, the idea is to equip each piece of data, each application, each system and each user with the security required to function autonomously and securely, whether inside or outside the traditional IT security perimeters. Instead of an external blanket approach to try to shield everything from harm, security is built in from the inside towards the outside.
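One way to illustrate data that carries its own security wherever it travels is to attach a cryptographic integrity tag to each record, so any recipient with the key can verify it without relying on a perimeter. This is only a sketch of the principle using Python’s standard library; the `seal`/`verify` names and record format are invented for the example.

```python
import hashlib
import hmac
import json

def seal(record: dict, key: bytes) -> dict:
    """Attach an HMAC tag so the record can be verified wherever it travels."""
    payload = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"payload": record, "tag": tag}

def verify(sealed: dict, key: bytes) -> bool:
    """Return True only if the record is intact and was sealed with this key."""
    payload = json.dumps(sealed["payload"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sealed["tag"])
```

A real data-centric deployment would add encryption and key management on top, but the shape is the same: the protection is bound to the data itself rather than to the network it happens to sit on.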
The problem of e-waste, which has been growing for decades, shows no signs of receding: the volume of retired products keeps climbing. The good news, however, is that the current focus on environmental issues appears to be creating an atmosphere in which more substantial actions are possible.
Curbed lays out the e-waste problem, which is pretty straightforward: people buy huge amounts of electronic equipment, and those numbers continue to grow. Two things are true of that equipment: only a small portion gets recycled or carefully destroyed when its useful life ends, and the vast majority contains dangerous elements.
The numbers are staggering.
More than eighty percent of enterprises plan to adopt OpenStack as a cloud computing solution or already have. Yet half of the organizations that have tried to implement it have failed, hampered by a lack of open source cloud computing skills. That’s according to a survey out this week from SUSE, the Linux vendor, which sheds vital light on current OpenStack adoption trends.
The survey results suggest strong enthusiasm for open source cloud computing, with ninety-six percent of respondents reporting they “believe there are business advantages to implementing an open source private cloud,” according to SUSE.
Strong interest in private clouds of the type OpenStack enables is also clear. Ninety percent of businesses surveyed have already implemented at least one private cloud, SUSE reported.
The closer the enterprise gets to implementing Big Data analytics, the more daunting it appears. Even organizations that are well-versed in data warehousing realize that building infrastructure for the so-called “data lake” is a completely different ballgame.
Not only does the data lake require large amounts of computing power and storage access, it has to be integrated with cutting-edge analytics, automation, orchestration and machine intelligence. And ideally, this state-of-the-art infrastructure should be accessible to the average business executive who has little or no experience in the data sciences.
But as we’ve seen many times, things that seem impossible at the outset are often possible once you put your mind to it. And data lake technology is already starting to make its mark at the top end of the enterprise market and shows every indication of trickling down to the lower tiers.
In response to recommendations from the Government Accountability Office (GAO) and the Department of Homeland Security’s (DHS) Office of Inspector General, FEMA has posted a notice of proposed rulemaking in the Federal Register seeking comment on the concept of a disaster deductible for states and local governments in lieu of raising the threshold for disaster declarations.
The concept of the deductible would be tied to a predetermined “level of financial commitment” as a condition of eligibility for financial assistance under the Public Assistance Program made available through presidential disaster declarations.
The overall goal is to reduce the burden on taxpayers through mitigation incentives and risk-informed decisions that promote resilience.
The GAO and Office of the Inspector General had recommended raising the threshold for disaster declarations, a move the agency thought would be regressive and would put many states in a precarious position. In response, FEMA staff came up with the deductible concept, but the agency is seeking details from state and local emergency managers. “This is not a done deal; this is a concept that we’re asking the state and local emergency managers to weigh in on,” said FEMA Administrator Craig Fugate. “We still have to respond back to the GAO and Inspector General about how we are going to address their concerns that the threshold for getting a declaration is too low.”
A new survey of 402 small and medium-sized businesses (SMBs) indicated that the majority of IT service providers received lower client satisfaction ratings than business-to-business (B2B) service industry benchmarks.
The survey, conducted by B2B ratings and reviews website Clutch, showed IT services firms averaged a Net Promoter Score (NPS) of 13; comparatively, B2B companies typically average a 20 to 25 NPS, according to Clutch.
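For context, NPS is computed from 0–10 ratings: the percentage of promoters (scores of 9–10) minus the percentage of detractors (scores of 0–6), with 7–8 counted as passives. A minimal sketch of that arithmetic follows; the sample data is invented purely to show how a score like the 13 reported for IT services firms can arise.

```python
def net_promoter_score(ratings):
    """NPS = % promoters (9-10) minus % detractors (0-6), from 0-10 ratings."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# A made-up sample: 40 promoters, 33 passives, 27 detractors out of 100 respondents
sample = [10] * 40 + [8] * 33 + [5] * 27
print(net_promoter_score(sample))  # prints 13
```

Because passives drop out of the formula entirely, an NPS of 13 can mask a large bloc of merely satisfied clients, which is worth keeping in mind when comparing it against the 20–25 B2B benchmark.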
What factors are affecting SMBs' views of IT service providers? Clutch noted that specialized IT services firms "may lack the ability to deliver strategic, top-level recommendations for their clients."
Without skilled professionals running the operations, how effective are our security systems? More importantly, how mature are these security systems?
According to Hewlett Packard Enterprise’s newly released study, State of Security Operations Report 2016, companies are failing when it comes to security monitoring and goals. The report measured four areas of performance in security maturity: people, processes, technology and business function. As the report stated:
The reliable detection of malicious activity and threats to the organization, and a systematic approach to manage those threats are the most important success criteria for a mature cyber defense capability.