Organizations today are, without a doubt, handling more data flowing in and out from more directions than ever. Dealing strategically with that data, from integration to analysis, is a huge part of this blog’s goal.
Sometimes, however, you have to stop and smell the tactical. And a recent study conducted by the government IT site MeriTalk raises some BIG red flags about whether federal, state and local governments can manage the influx of data we’re about to see.
The report identifies five factors, which it calls the Big Five of IT, that will significantly affect the flow of data into and out of organizations: Big Data, data center consolidation, mobility, security and cloud computing.
Most IT professionals these days are well aware of the coming changes in data center infrastructure – perhaps not on an intimate level just yet, but many of the basic concepts behind cloud computing and software-defined infrastructure seem clear enough.
Last week, I highlighted some of the thinking around the advent of enterprise-class ARM infrastructure in the data center, with the note that ARM processors are primarily suited to the high-volume, small-packet workloads characteristic of mobile and web-facing applications. But while much of the trade press has focused on ARM ultimately “taking over” the data center, knocking the x86 off its 30-year perch, the reality is a bit more nuanced.
The thing is, web and mobile applications are not the only things coming the enterprise’s way. There are also things like Big Data, enterprise application processing, and even desktop video conferencing and surveillance data to take into consideration. These functions typically involve lower-volume, large-packet workloads, which are better suited to the x86.
InfoWorld — "We shall fight on the beaches. We shall fight on the landing grounds. We shall fight in the fields and in the streets. We shall fight in the hills. We shall never surrender," said Winston Churchill in his famous June 1940 speech in the face of Nazi attacks on England. His earlier commitment to the goal of victory, "however long and hard the road may be," is an apt analogy to the security battles that enterprises face.
The bad guys are persistent and sophisticated, and they're making inroads. It is hard to be optimistic when customers, investors, and regulators expect us to totally protect precious assets and preserve privacy, while some governments and vendors on whom we depend are themselves compromising our data, software, and networks.
The fight for security is harder than ever. Most organizations are fighting today's war with yesterday's tools and approaches -- such as protecting perimeters with passwords and firewalls -- and losing. There is too much emphasis on walling off our data and systems, and a misplaced belief that the secured-perimeter approach is adequate.
CSO — For years enterprises have battled to prevent and manage data breaches, yet the costs associated with data breaches keep climbing higher -- especially for organizations in highly regulated industries. The average cost of a breach today is $188 per record in the U.S., according to the Ponemon Institute, with the total cost of a data breach reaching upwards of $5.4 million. Also according to Ponemon, average losses are up 18% from the same survey in the prior year.
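Taken at face value, Ponemon's figures can be sanity-checked with simple arithmetic. This sketch assumes a naive linear cost model (total cost = per-record cost × record count), which per-record averages like these only roughly support:

```python
# Back-of-the-envelope check of the Ponemon figures cited above.
# Assumes total cost scales linearly with record count -- a simplification.
COST_PER_RECORD_USD = 188       # U.S. average cost per breached record
TOTAL_COST_USD = 5_400_000      # upper-end total breach cost cited

implied_records = TOTAL_COST_USD / COST_PER_RECORD_USD
print(f"Implied breach size: ~{implied_records:,.0f} records")
# ~28,723 records

# Losses are up 18% year over year, implying a prior-year total of:
prior_year_total = TOTAL_COST_USD / 1.18
print(f"Implied prior-year total: ~${prior_year_total:,.0f}")
# ~$4,576,271
```

In other words, the headline $5.4 million figure corresponds to a breach of roughly 29,000 records under this simple model.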
Our own Global Information Security Survey finds that breach costs are rising, as well, especially for those organizations with less mature security programs.
Is there anything organizations can do to curb rising breach costs? Turns out there's plenty, and most of it involves things enterprises should already be doing.
EXETER, Rhode Island – Carousel Industries, a leading integrator in business practices and technologies, announced today that it has been recognized as the 2013 Polycom Systems Integrator of North America. The award was presented at Team Polycom in Vancouver, Canada. Polycom, a leader in HD video conferencing, voice conferencing, and telepresence, enabling open standards-based video collaboration, recognized honorees based on execution and performance, customer success, business results, and overall commitment to co-developing innovative solutions.
“Partners are a critical component of Polycom’s success,” said Mike Conlon, Vice President of Worldwide Channels, Polycom. “Together, we serve a diverse customer base with unique challenges to solve. Our combined efforts help them succeed and together we will continue to drive growth and broader adoption of Polycom’s video, voice and content solutions."
Carousel Industries won the award for its performance as an integrator with its partners while maintaining a focus on customer needs and solutions.
"We are honored to be recognized as Polycom Systems Integrator of the Year,” said James Marsh, Senior Vice President of Sales, Carousel Industries. “Carousel remains committed to integration so we can provide an unparalleled customer experience in each of our focused business practices including data, unified communications, voice, video, cloud and services."
Carousel Industries consults, integrates and manages technology solutions that solve business problems and contribute to an organization’s growth. This includes managed services, voice, video and data solutions, unified communications and cloud solutions.
Today Carousel has over 6,000 customers, including 35 of the Fortune 100. Carousel has been recognized by both VAR and CRN Magazines as one of the top technology integrators in the US and has been listed in the Inc. 500/5000 seven times. Carousel is headquartered in Exeter, RI with over 1,000 employees working from offices in 30 locations across the US, including over 250 service technicians deployed across the country. For more information visit http://www.carouselindustries.com.
FAIRFIELD, N.J. – Continuity Logic, a leading provider of Enterprise Governance solutions, announced that it has been positioned by Gartner in the “Leaders” quadrant in its recently released “Magic Quadrant for Business Continuity Management Planning Software(1)”. Out of 18 products evaluated, Continuity Logic’s FrontLine Live was evaluated as one of the five “Leaders” for completeness of vision and ability to execute.
“We believe that Gartner’s recognition of Continuity Logic as an industry leader validates our next generation approach to governance, with a powerfully integrated business continuity, disaster recovery, risk, and compliance management solution,” said Tejas Katwala, CEO of Continuity Logic. “By modeling the dependencies of people, process, technology and vendors, FrontLine Live is able to dynamically create and prioritize recovery plans with its RTO engine in a way that is just not possible with static, scenario-based solutions.”
“Through its highly automated design, FrontLine Live is able to deliver on its value proposition of ease of use, rapid time to deployment, and enormously increased operating efficiency,” states Mr. Katwala.
By combining high-value business impact analytics with business rules, regulatory requirements, and its real-time RTO (Recovery Time Objective) Engine, Continuity Logic’s customers realize tremendous ROI when using FrontLine Live’s Incident Manager (its virtual command center). The software automates incident response, business resumption, technology recovery, and supply chain management with real-time task management and notifications, interactive dashboards, powerful real-time reporting, and self-service features.
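Continuity Logic does not publish the internals of its RTO engine, but the general idea of deriving a recovery sequence from dependency data and recovery-time objectives can be sketched roughly as follows. All names, data, and the tie-breaking rule here are hypothetical illustrations, not FrontLine Live's actual model or API:

```python
# Hypothetical sketch: order recovery tasks so each process's dependencies
# are restored first, breaking ties by the tightest RTO (in hours).
rto_hours = {"payments": 2, "crm": 8, "email": 4, "database": 1}
depends_on = {"payments": ["database"],
              "crm": ["database", "email"],
              "email": [],
              "database": []}

def recovery_order(rto, deps):
    """Topological order over the dependency graph, tightest RTO first."""
    order, done = [], set()
    remaining = set(rto)
    while remaining:
        # Processes whose dependencies have all been recovered already
        ready = sorted((p for p in remaining if set(deps[p]) <= done),
                       key=lambda p: rto[p])
        if not ready:
            raise ValueError("circular dependency in recovery plan")
        p = ready[0]
        order.append(p)
        done.add(p)
        remaining.remove(p)
    return order

print(recovery_order(rto_hours, depends_on))
# ['database', 'payments', 'email', 'crm']
```

The database comes back first despite three other processes competing, because everything else depends on it and its RTO is the tightest; a static, scenario-based plan would have to be rewritten by hand whenever these dependencies change.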
As one of the industry’s most complete and integrated enterprise-class governance solutions, FrontLine Live empowers organizations to gain more insight, across more domains, and drive better outcomes throughout every aspect of their enterprise governance program.
(1) Gartner, “Magic Quadrant for Business Continuity Management Planning Software,” Roberta J. Witty and John P. Morency, August 26, 2013.
About Magic Quadrant
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner's research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
About Continuity Logic
Continuity Logic’s market-leading FrontLine Live platform seamlessly integrates the governance of business continuity, risk, and compliance management with workflow and advanced analytical capabilities. All data, business continuity and technology recovery plans, policies, procedures, regulatory requirements, supply chain information, and all collaborations are actively managed in one environment, via any device, through a simple, easy-to-use interface. FrontLine Live is offered as a cloud service, with complete portability, to Fortune 2000 companies in healthcare, financial services, insurance, manufacturing, and other industries. For more information, visit www.continuitylogic.com.
Having spent 17 years of my career in Asia, I’ve long encouraged IT professionals to consider relocating outside of the United States, not just to advance their careers, but to enable them and their families to reap the many benefits of that experience. So when someone with more than 35 years in high-profile leadership positions who’s a lot smarter than I am says the same thing, I want his voice heard.
Ritch Eich, a management consultant and author of the book, “Leadership Requires Extra Innings: Lessons on Leading from a Life in the Trenches,” strongly encourages young people to expand their global outlook. Eich is a keen advocate of considering overseas relocation, so in an interview last week, I asked him to elaborate on the reasons for his advocacy. He said it’s one of the most important things people can do:
COMPUTERWORLD — California is facing its worst drought in more than 100 years, and one with no end in sight. Conserving water has never been more important, and Silicon Valley has an opportunity to offer technological solutions to the problem.
Consider, for example, the approach the East Bay Municipal Utility District took to encouraging customers to reduce water consumption.
Using technologies not available in earlier droughts, the Oakland-based agency issued report cards on water usage to 10,000 of its 650,000 customers in a year-long pilot program. For instance, EBMUD would put worried-looking smiley faces on the statements it sent to people in two-person households who used more than 127 gallons per day -- the average for a household that size. The statements disclosed each household's actual water usage and urged the customers to "take action" -- and many did.
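The report-card logic EBMUD described is simple enough to sketch. The 127-gallon average for two-person households comes from the article; the function names, other thresholds, and message wording below are illustrative assumptions, not EBMUD's actual system:

```python
# Illustrative sketch of an EBMUD-style water report card: compare a
# household's daily usage to the average for its size and pick a face.
# Only the 127 gal/day figure for two-person households is from the
# article; everything else here is hypothetical.
AVG_DAILY_GALLONS = {2: 127}  # average usage by household size

def report_card(household_size, gallons_per_day):
    avg = AVG_DAILY_GALLONS.get(household_size)
    if avg is None:
        return "no benchmark available"
    if gallons_per_day > avg:
        over = gallons_per_day - avg
        return f":( using {over:.0f} gal/day above average -- take action"
    return f":) at or below the {avg} gal/day average"

print(report_card(2, 150))  # worried face: 23 gal/day over average
print(report_card(2, 110))  # smiley face: under average
```

The point of the pilot was exactly this kind of peer comparison: showing customers their usage next to a neighborhood benchmark, rather than in isolation, is what prompted many of them to act.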
Did you develop Big Data in a silo?
It’s okay. You can be honest here. You’re among friends. In fact, it’s a safe bet you’re not alone, since experts were predicting this might happen back in 2012. All the signs suggested organizations were developing Big Data in a sandbox; by default, that meant Big Data often became yet another data silo.
So you’re in good company if you developed your Big Data analytics in a silo, beyond your regular systems.
If past is truly prologue, then it shouldn’t come as a surprise to anyone who has studied the history of data infrastructure that virtualization, advanced cloud architectures and open, distributed computing models are starting to look a lot like the mainframe of old—albeit on a larger scale.
Everywhere you look, in fact, people are talking about pooled resources, higher utilization rates, integrated systems and a rash of other mainframe-like features intended to help the enterprise cope with the rising tide of digital information. Put another way: If the network is the new PC, then the data center is the new “mainframe.”
Of course, this new mainframe data center will differ from the old in a number of ways, most notably in the skill sets and development environments needed to run it. At the recent OCP Summit, for instance, there was no shortage of speakers highlighting the need for organizations to ramp up their knowledge of next-generation virtual and cloud technologies that will pull workaday infrastructure management tasks from the physical layer into more flexible software-defined constructs. It’s worth noting, however, that the virtualization and resource utilization techniques that ushered in the cloud were not created out of whole cloth during the client-server period, but were in fact carried over from earlier mainframe environments.