The data center is dead. Long live the data center.
This may be a bit premature, but if the traditional enterprise data center is not dead yet, it certainly is approaching the twilight of its years.
The latest word from 451 Research is that enterprise data center construction is essentially flat across the globe while the new crop of cloud-facing, hyperscale facilities is on the rise. Results for the fourth quarter of 2014 have the installed base growing a paltry 0.2 percent to 4.3 million facilities, propped up only by increased activity among the cloud, service provider and multi-tenant sectors. Enterprise IT still controls an overwhelming portion of the worldwide data infrastructure, some 95 percent, and maintains about 83 percent of data center square footage, according to the report. But for now at least, the trend lines are clearly pointing away from owned-and-operated data center facilities toward more cloud- and service-based activity.
The line between consumer and business technology has gotten increasingly blurry during the past decade. Consumer devices are almost indistinguishable from enterprise gear. But the gap between software and applications in each category is far wider.
That’s a good thing to understand as wearables become more common at work. This conversation between Jim Haviland, VoxMobile’s chief strategy officer, and IT Business Edge’s Don Tennant gives a good overview of the current state of wearables. At one point, Haviland makes clear that the real action will be on the software front:
Hardware always gets the headlines, but apps are where the value creation happens in the enterprise. We have been using the mantra, ‘the right information on the right screen at the right time,’ because the key to valuable innovation with mobility is all about application success and user experience. Wearables expand the possibilities for how and when people interact with apps and data, which can lead to more dramatic successes.
Data can really be anything, including images, geolocation figures, texts, numbers or some combination thereof.
Thanks to the Internet of Things, more of that data is actually describing a physical thing. For us sci-fi geeks, that inevitably raises the question: Can data create a virtual world to actually interact with these things?
InfoWorld reports that Space-Time Insight is exploring this idea with a pilot data project. It’s using virtual reality headsets such as the Oculus Rift as a way to interact with the data.
The company’s data has a unique physicality to it, since it’s a B2B partner for the power, oil and gas, logistics and related industries. For instance, in the power industry, the company collects data about transformers. Space-Time Insight’s solution lets users see a 3D model of a transformer, along with warning signals about what’s wrong. Users could even bypass another application entirely, acting from within the 3D space or calling in a work team, InfoWorld reports.
Over the past few years, social media has become a key communications channel for emergency managers. Whether it’s sharing preparedness messages during blue-sky times or getting crucial information out in real time during an emergency, platforms like Twitter and Facebook are now part of nearly every agency's public-outreach plan. This evolution in crisis communications has drawn wide attention, and a recently released study sought to understand what affected populations, response agencies and other stakeholders can expect from tweets in various types of disaster situations.
The study, What to Expect When the Unexpected Happens: Social Media Communications Across Crises (PDF), examined tweets posted during 26 emergency situations in 2012 and 2013. With the goal of measuring the prevalence of different types of tweets during various situations, the researchers examined both the information and its source.
The tweets were classified into six categories, and researchers determined the average percentage of tweets for each: affected individuals (20 percent), infrastructure and utilities (7 percent), donations and volunteering (10 percent), caution and advice (10 percent), sympathy and emotional support (20 percent), and other useful information (32 percent). Tweets classified as other useful information varied significantly, the report says. “For instance, in the Boston bombings and LA Airport shootings in 2013, there are updates about the investigation and suspects; in the West, Texas, explosion and the Spain train crash, we find details about the accidents and the follow-up inquiry; in earthquakes, we find seismological details.”
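The study's headline numbers are simply each category's share of the hand-labeled tweets, averaged across crises. A minimal sketch of that tally (the category names follow the report; the sample tweets, their labels and the function name are invented for illustration):

```python
from collections import Counter

# Hypothetical labeled tweets: (tweet_id, category) pairs of the kind
# the study's coders would produce for a single crisis.
labeled = [
    ("t1", "affected individuals"),
    ("t2", "other useful information"),
    ("t3", "sympathy and emotional support"),
    ("t4", "other useful information"),
    ("t5", "caution and advice"),
]

def category_shares(labels):
    """Return each category's share of the labeled tweets, as a percentage."""
    counts = Counter(cat for _, cat in labels)
    total = sum(counts.values())
    return {cat: 100 * n / total for cat, n in counts.items()}

shares = category_shares(labeled)
# "other useful information" accounts for 2 of 5 tweets here, i.e. 40%.
```

Averaging these per-crisis shares across the 26 events yields figures like the ones above; note they sum to roughly (not exactly) 100 percent because they are averages of per-event distributions.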
(TNS) — When tornado sirens went off in Logan County on May 24, 2011, three Guthrie churches that had volunteered to serve as storm shelters were quickly overrun — and not just by people.
Dogs, cats and birds were packed together in church basements with residents looking to escape the tornado, said Logan County Emergency Management Director David Ball. One man showed up to a church with a boa constrictor wrapped around him, Ball said.
While everyone else was jockeying for space, the man and his snake always seemed to have plenty of room to themselves, Ball said.
Ball spoke Tuesday at the National Tornado Summit in Oklahoma City. Since the May 2011 storm, emergency managers have increasingly concluded that public shelters can do more harm than good, he said. Convincing residents to take steps to make sure their homes are safe during tornado season can be a challenge, he said, but it’s the most viable way to keep residents safe.
What worries chief information officers (CIOs) and IT professionals the most? According to a recent survey commissioned by Sungard Availability Services, security, downtime and talent acquisition weigh heaviest on their minds.
Due to the increasing frequency and complexity of cyber attacks, security ranks highest among workplace IT concerns for CIOs. As a result, more than half of survey respondents (51%) believe security planning should be the last item to receive budget cuts in 2015.
While external security threats are top of mind for IT professionals, internal threats are often the root cause of security disasters. Nearly two-thirds of survey respondents (62%) cited leaving mobile phones or laptops in vulnerable places as their chief security concern, followed by password sharing (59%). These employee-created security challenges led 60% of respondents to say they would enforce stricter security policies for employees in 2015.
Second to security, downtime is also a leading concern for CIOs. Two in five respondents (42%) consider testing of their disaster recovery plans vital to their organizations, and count it among the last line items that should be cut from 2015 IT budgets. Not only is downtime expensive, but the damage to an enterprise’s reputation far outweighs the monetary costs.
Disaster recovery testing dramatically reduces downtime (by 75%) for enterprises deemed 'best-in-class' in disaster recovery and business continuity. In addition, according to the Aberdeen Group, those that adopt strong resiliency plans can expect 90% less downtime per event compared to the industry average.
“Today CIOs are more concerned with the resiliency of their organizations and the consequences a disaster can have on an organization’s reputation and revenue stream,” said Keith Tilley, executive vice president, Global Sales & Customer Services, Sungard AS. “The implications that information security and downtime threats place on a business have evolved and become more complex in the last several years, making it a high priority for CIOs.”
It is not just CIOs and IT professionals who are concerned about the cyber threat. According to the Business Continuity Institute's latest Horizon Scan report, cyber attacks are the biggest concern for business continuity professionals as well with 82% of respondents to a survey expressing either concern or extreme concern at the prospect of this threat materialising. Data breach came third on the list with 75%.
Budding tech entrepreneurs with dreams of being the next Bill Gates should look to BJ Farmer as a shining example of how to succeed in this industry.
To listen to the entire interview, click here.
While he may not be quite as successful as Gates (is anyone?), Farmer has enjoyed far more success than most people who start their own tech companies. He is the founder and president of CITOC, a Houston-based IT services firm that specializes in premium cloud services and Microsoft 365 consulting.
CITOC recently celebrated its 20th anniversary (1995 – 2015), and in that span CITOC (an acronym for Change Is the Only Constant) has received a slew of awards, most notably winning Houston’s Microsoft Partner of the Year Award in 2013 and 2014. In addition, CITOC was listed in the 2011 edition of Inc.com’s annual Top 5000 list (ranked #3997 for its 2010 revenue of $4.6 million), and it has also been recognized as one of the Top 50 fastest growing tech companies in the Houston metro area seven years running by the Houston Business Journal.
We previously talked to Farmer about a prospective client of his that was caught in a costly cycle of hiring CIOs who soon left. We wanted to catch up with Farmer on how he helped this client.
Why are your customers using the cloud? Why aren’t others using it? As an MSP working with cloud-based file sharing, you should know what motivates your clients and prospects to either adopt or avoid the cloud.
Results from a new survey offer an interesting view into what people think of the cloud, how they use it, and what concerns you should address to bring more people into the cloud. Understanding what influences cloud sharing decisions will help you better position your services and be better prepared to handle objections.
Here are some findings from the survey that show why people either are or are not using the cloud, and how you can use that information to your advantage.
HP has published the 2015 edition of its annual Cyber Risk Report, which looks at the security threat landscape through 2014 and indicates likely trends for 2015.
Authored by HP Security Research, the report examines the data indicating the most prevalent vulnerabilities that leave organizations open to security risks. This year’s report reveals that well-known issues and misconfigurations contributed to the most formidable threats in 2014.
“Many of the biggest security risks are issues we’ve known about for decades, leaving organizations unnecessarily exposed,” said Art Gilliland, senior vice president and general manager, Enterprise Security Products, HP. “We can’t lose sight of defending against these known vulnerabilities by entrusting security to the next silver bullet technology; rather, organizations must employ fundamental security tactics to address known vulnerabilities and in turn, eliminate significant amounts of risk.”
Outlines the key considerations when dealing with enterprise software licensing
Mega vendors that have capitalised on the cloud computing revolution through acquisition, innovation and working in hybrid environments are exposing weak links in enterprise customers’ licensing teams, Concorde has said. In its latest white paper, “Understanding IBM Licensing”, the software value management specialist has raised questions surrounding users’ misconceptions of complex licensing programmes, and urges large firms to form improved strategies for both license and vendor management.
Martin Prendergast, CEO and Co-Founder of Concorde, explained: “There has often been a misconception that software asset management is simply about having the deep-seated knowledge about the quantity of licenses within the enterprise. However, over the years this has changed immensely with cloud computing and mobility, and it’s becoming increasingly important to know how to work with vendors closely, and not to overlook important complexities when interpreting and managing software contracts.”
He continued: “Large vendors such as IBM and Oracle, which have been on the acquisition trail in recent years, supply invaluable technology to huge numbers of global organisations to manage critical aspects of their business. But because of this, licensing is often complex, and many end user licensing teams are ill equipped to understand which products are aligned with existing technology, and which are not. It’s time to get ahead of the game, get expert help, and plan the IT estate structure for the future in order to retain control over planned and unplanned change in software licensing.”
Prendergast outlines a few tips for how to optimise complex software license contracts:
1. Don’t discount your existing vendor relationships
When you’re working with a specific account manager, make sure you leverage that relationship; it can prove far more useful than you might think when it comes to an audit. Although IBM, Oracle and a number of other vendors use third-party auditors that work independently from your usual account manager, your relationship can be key when it comes to negotiating license pricing. If the compliance team reveals a financially penalising license position, retaining you as a customer may matter more to the vendor than charging you for back maintenance, so make sure you are forming close relationships built on trust.
2. Evaluate your product substitution options
A number of large software vendors allow an organisation to “substitute” one product for another of the same financial value. While this can provide flexibility when it comes to planning and business transformation, make sure you check which products can actually be substituted. This information is usually found in a subsection of the agreement, so make sure you know upfront what you can and can’t substitute.
3. Have complete clarity on Software and Service Special Option Agreements
While customer-specific agreements can offer enterprises an almost unlimited range of license options tailored to the individual needs of the organisation, they must be handled with care. Making sure you and your vendor have a clear mutual understanding of any phrasing that could be deemed ambiguous can head off issues that might emerge later.
4. Be aware of the granular detail of sub-capacity licensing
Although sub-capacity licensing is not permitted under standard IBM terms, deals are sometimes struck when a customer has specifically requested it. However, it is important to also implement the IBM License Metric Tool (ILMT), so that at audit time it is easier to show how many IBM products are deployed and on how many servers. With any tool other than ILMT, IBM will have less visibility, and if you can’t present the information it requires from ILMT, you risk being charged as if you needed a full-capacity license.
To download Concorde’s full “Understanding IBM Licensing” white paper, please visit the Concorde website: http://www.concordesoftware.com/resources-links/understanding-ibm-licensing.aspx