As small businesses prepare for 2014, they shouldn't focus solely on increasing their bottom lines.
Paychex, a provider of payroll, human resource, and benefits outsourcing solutions, says it's equally important for small businesses to be aware of the legislative issues that could affect their operations in the year to come.
"Navigating all of the legislative and regulatory changes that occur throughout the course of the year can be challenging, taking business owners away from other important aspects of running their businesses," said Martin Mucci, Paychex president and CEO.
CIO — The holiday season is a great time to look back at the year, with an eye toward what we in the ever-changing world of information technology can expect in 2014. These three trends warrant your close attention in the new year.
In Light of NSA Revelations, Companies Will Be Wary of the Cloud
For most businesses, 2013 was the year of the cloud. Companies that still hosted their email in house largely moved that expense and aggravation to someone else. Microsoft SharePoint and other knowledge management solutions could be run in someone else's data center, administered with someone else's resources and time, thus freeing your own people to improve other services or, gasp, work directly on enhancing the business.
But then Edward Snowden came around in June and started to release a series of damning leaks about the United States National Security Agency's capability to eavesdrop on communications. At first, most folks weren't terribly alarmed. But as the year wore on, the depth of the NSA's alleged capabilities to tap into communications — both with and without service provider knowledge — started to shake the faith of many CIOs in the risk/benefit tradeoff for moving to cloud services.
Data center infrastructure will undergo dramatic change across the board in the coming year, but while much of the focus will be on software-defined architectures and cloud computing, bare metal changes are on tap as well.
This is actually quite a heady time for servers in particular, given that the pressure to revamp data-handling capabilities is mounting as the enterprise struggles to meet the challenges of mobility, Big Data, collaboration and other macro forces.
For InterWorx’s Graeme Caldwell, the rise of high-volume/small-packet data traffic will lead directly to the ARM architecture finally breaking the “x86 monoculture” that has gripped the enterprise for so long. ARM chips thrive in the chaotic universe of mobile data, so if the enterprise wishes to scale resources up and down to suit ever-changing load volumes, it would be better off with legions of low-power ARM units at its disposal than with highly virtualized x86 machines. And while Intel currently holds a slight edge with its 64-bit Avoton SoC, the coming year will see 64-bit ARM chips from Calxeda, Applied Micro and others.
The coming year will be a pivotal one for a wide range of data center components including everything from servers and storage to the virtual layer and cloud architectures. But before I get to all of those, I thought it would be a good idea to see what is likely to happen to the data center itself. After all, with enterprise infrastructure poised for some truly wide-scale distribution, the data center is increasingly being viewed as a single component of perhaps a global data environment.
And while some may argue that the data center will diminish in importance as responsibility for actual physical layer infrastructure falls to the cloud provider, the fact remains that for the coming year, at least, enterprises of all sizes will rely on their own data facilities to a higher degree than in years past.
If you can see what will happen in the future, you can take steps to prepare for it – or avoid it, or even change it. That’s the promise of predictive analytics, a topic that naturally interests business continuity managers. While there’s no guarantee of exact predictions, predictive analytics can indicate change patterns and emerging trends. Sensibly constructed models can show areas of combined high uncertainty and influence, where particular attention should be paid in preparing to ensure continuity. However, predictive analytics as such fall short in two areas related to business continuity: one of them can be ‘fixed’ by using a similar approach, whereas the other cannot.
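The idea of spotting change patterns and emerging trends can be made concrete with a small sketch. Below is a minimal, illustrative example, not any particular vendor's method: it fits a simple linear trend to a metric a continuity manager might track (the monthly incident counts are hypothetical) and extrapolates it forward. Real predictive-analytics models are far richer, but the mechanics of "learn a pattern, project it ahead" look like this:

```python
# A minimal trend-based forecasting sketch using ordinary least
# squares, implemented with only the standard library. The data and
# function names are illustrative assumptions, not a real system.

def fit_trend(values):
    """Least-squares fit of y = a + b*x for x = 0..n-1; returns (a, b)."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    b = cov / var          # slope: the "emerging trend"
    a = mean_y - b * mean_x  # intercept
    return a, b

def forecast(values, steps):
    """Extrapolate the fitted trend `steps` periods into the future."""
    a, b = fit_trend(values)
    return [a + b * (len(values) + k) for k in range(steps)]

incidents = [3, 4, 4, 5, 6, 6, 7, 8]  # hypothetical monthly counts
print(forecast(incidents, 3))
```

The spread of the data around the fitted line would stand in for the "high uncertainty" areas the text mentions: a wide spread means the projection deserves less trust and more contingency planning.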
Many folks take the days between Christmas and New Year’s off. Others, of course, have to work, despite the consumption of too much egg nog.
If you do have to work, it makes sense to be as productive as possible. This year, keep in mind that the late fall has been characterized by winter-like weather. It is not a good sign that suddenly the people who are in charge of this sort of thing have decided to name the storms that seem to be meandering from west to east on a regular basis.
So why not focus on a business continuity plan? These plans are vital, and may come in handy very quickly.
Nothing happens without good planning and implementation strategies, and that holds when planning out the development of a Business Continuity Management (BCM) / Disaster Recovery (DR) program. You can't just start something without any idea of when you'll be finished, or of the milestones you need to reach along the way before you can take the next step.
Often, to get proper buy-in from executives, a BCM/DR practitioner has to provide a timeline alongside the goals and deliverables the project will provide. It's one thing to provide the reasons why you need a program; if executives accept those reasons as valid (let's hope they do…), the next question will be, "When will it be done?" So a draft timeline must be mapped out, from how long a Business Impact Analysis (BIA) will take and when its findings will be delivered, to when the first test will occur.
Of course, it will all be built upon assumptions, such as resource availability, but a high-level timeline must be provided to executives. Below are ten considerations a practitioner must keep in mind when building the BCM/DR program:
IDG News Service (Boston Bureau) — While the bulk of enterprise software is still deployed on-premises, SaaS (software as a service) continues to undergo rapid growth. Gartner has said the total market will top $22 billion through 2015, up from more than $14 billion in 2012.
The SaaS market will likely see significant changes and new trends in 2014 as vendors jockey for competitive position and customers continue shifting their IT strategies toward the deployment model. Here's a look at some of the possibilities.
A storm that left at least nine people dead and more than 400,000 without power this weekend was pushing its way into Canada on Sunday, but holiday travelers may still face slick roads as the system douses the Southeast with heavy rainfall.
The storm that brought high winds, ice, snow and rain to a wide swath of the Southeast before roaring north will affect sections of the USA through Monday night, said Frank Strait, senior meteorologist with AccuWeather.
"The main part of the storm is pulling away into Canada now and taking some of the snow with it," Strait said. But a lingering cold front could stretch from Virginia to Pensacola, Fla., causing heavy downpours before the system finally begins to weaken.
Distributed denial-of-service (DDoS) attacks certainly aren’t new. I’ve been talking about them for years. However, they have been changing. The traditional style of attack, the flood-the-target type that crashes a website, is still going strong. But now we are seeing an increase in application-layer attacks that have the same goal: Systems go down, resources are unavailable and the victim is scrambling to fix everything.
Recently, Vann Abernethy, senior product manager for NSFOCUS, talked to me about the changing DDoS landscape. Something he has noticed is how DDoS attacks are being used as smokescreens to cover up other criminal activity. He said:
In fact, the FBI warned of one such attack type back in November of 2011, which relies upon the insertion of some form of malware. When the attacker is ready to activate the malware, a DDoS attack is launched to occupy defenders. In this case, the DDoS attack is really nothing more than a smokescreen used to confuse the defenses and allow the real attack to go unnoticed – at least initially. Considering that most malware goes undetected for long periods of time, even a small DDoS attack should be a huge red flag that something else may be going on.
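The "small DDoS as a red flag" point can be sketched in code. The following is an illustrative example, not a description of any real product: it flags intervals whose request count jumps well above a rolling baseline, so defenders can treat even a modest spike as a cue to look for concurrent activity (malware beaconing, unusual logins) rather than just absorbing the flood. The window size, threshold factor, and traffic figures are all assumptions chosen for the example.

```python
# Flag traffic spikes against a rolling baseline, standard library only.
from collections import deque

def flag_anomalies(counts, window=5, factor=3.0):
    """Return indices where a count exceeds factor * the rolling mean
    of the previous `window` intervals."""
    history = deque(maxlen=window)
    flagged = []
    for i, c in enumerate(counts):
        if len(history) == window:
            baseline = sum(history) / window
            if baseline > 0 and c > factor * baseline:
                flagged.append(i)
        history.append(c)
    return flagged

# Hypothetical per-minute request counts: steady traffic, then a
# modest spike that is small for a DDoS but still worth investigating.
traffic = [100, 110, 95, 105, 100, 102, 900, 98, 101]
print(flag_anomalies(traffic))  # the spike at index 6 is flagged
```

A flagged interval here wouldn't prove a smokescreen attack, of course; the point, matching Abernethy's observation, is that the alert should trigger a broader look at what else happened during the spike.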