ATLANTA, Ga. — As companies seek to cut costs and improve efficiency, a growing number of businesses encourage or allow their employees to use their own digital devices at work. “Navigating the IT, privacy, security and intellectual property issues was difficult enough before Bring Your Own Device (BYOD) became common,” says attorney and engineer Janine Anthony Bowen, a shareholder in national law firm LeClairRyan’s Atlanta, Ga. office. “But as the trend surges – and the law catches up with it – companies should carefully review their BYOD policies.”
Challenges range from liability for unpaid overtime to stiff legal penalties for failing to preserve data that may be subject to the eDiscovery process, adds Bowen, a member of LeClairRyan’s Privacy and Data Security Practice.
When Hurricane Matthew swept through the Southeastern United States earlier this month, it left behind extensive debris, thousands without power, and many people living in shelters. In North Carolina, Florida, Georgia and South Carolina, meanwhile, a total of 17 people lost their lives.
Today’s storm forecast models are more advanced than ever before. So how is it that so many residents were caught unprepared when Hurricane Matthew swept into their towns earlier this month? The truth is that storms are notoriously unpredictable, and while forecasts can help, they ultimately only go so far. Let’s take a closer look at what went wrong with Hurricane Matthew, and highlight the single best way to protect yourself, your loved ones, and the members of your community when a storm is on its way.
Until recently, the conventional wisdom about data storage was that on-premises solutions don’t offer the flexibility or cost savings of the cloud. Enterprises may worry about handing control of their data and IT infrastructure to a cloud provider for security reasons, but they’re willing to set those concerns aside if they think they can get the scale and storage they need — at a good price.
Depending on your business, this might have been true in the past: If you were dealing with big data sets, needed low latency, and wanted to save money, the cloud may have been the right choice. Spinning disks didn’t offer the performance needed, and flash drives were too expensive to use in bulk.
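That old calculus comes down to simple arithmetic: cheap spinning disks force you to buy capacity you don’t need just to hit a performance (IOPS) target. The sketch below makes the tradeoff concrete — all prices, drive sizes, and IOPS figures are hypothetical round numbers for illustration, not vendor quotes:

```python
# Hypothetical, illustrative media prices -- not vendor quotes.
MEDIA = {
    # name: (dollars per GB, IOPS per drive)
    "spinning_disk": (0.03, 150),
    "flash_ssd": (0.20, 50_000),
}

def cost_for(workload_gb, required_iops, price_per_gb, iops_per_drive):
    """Rough media cost: buy enough drives by capacity OR by IOPS, whichever
    is larger. Assumes 4 TB drives; ignores redundancy, power, controllers."""
    drive_gb = 4000
    drives_for_capacity = -(-workload_gb // drive_gb)      # ceiling division
    drives_for_iops = -(-required_iops // iops_per_drive)  # ceiling division
    drives = max(drives_for_capacity, drives_for_iops)
    return drives * drive_gb * price_per_gb

# A 100 TB workload needing 100,000 IOPS: disk is cheap per GB but needs
# hundreds of spindles to reach the IOPS target; flash needs only a few drives.
for name, (per_gb, iops) in MEDIA.items():
    print(name, cost_for(100_000, 100_000, per_gb, iops))
```

With these made-up numbers the spinning-disk build ends up several times more expensive than flash once the IOPS requirement, not capacity, dictates the drive count — which is exactly the dynamic falling flash prices have upended.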
Recent changes in the storage market have weakened the argument that storage in the public cloud is the only cost-effective option. Your data center doesn’t necessarily have to be built in the cloud if you’re trying to get that magic combination of cost effectiveness and performance. Here’s what’s happening in the data storage market that should factor into your decision making:
There’s no longer any question that AI (artificial intelligence) is transforming the business world, and that’s great news for maintaining a corporate infrastructure built on the three pillars: governance, risk management and compliance (GRC).
Until now, the demands of GRC have been coupled with a spiraling need to increase productivity and cut costs in a hypercompetitive marketplace, turning this near-impossible feat into a never-ending and often losing battle. But with the introduction of cutting-edge AI and NLP (natural language processing) technologies into the workplace, companies are discovering they can turn impossibility into reality.
Artificial intelligence has become an indispensable support tool in pretty much every aspect of running a business, and the methodology behind effective GRC is no exception. Much of a company’s compliance and regulatory work centers on the need for better decision-making; automating the processes that contribute to timelier, more informed decisions is a primary objective of emerging AI solutions.
Earlier this year, a ransomware attack shut down the Lincolnshire County Council’s computer systems. For a week, members were reduced to using pens and pencils after the council refused to pay the $500 ransom demanded by the attackers.
It was a vivid example of the disruption ransomware can cause, and a warning to security executives, who are girding to contend with targeted ransomware attacks against current and planned cloud deployments.
No surprise there: malicious hackers, clearly creatures of habit, seek out the most promising targets. And while the cloud has so far proven its security critics wrong - it’s actually a lot more secure than many thought a few years ago - targeted ransomware attacks against it are on the rise.
You test, you plan, and you document, but is your business continuity program a sham?
It’s a question a senior executive of a client recently asked me. Sadly, the answer was a resounding “yes!” In many cases, we find that the pretty picture painted by the BCM team is not what it seems once you get up close and pull back the covers.
Why are so many programs in this state? Well, here are 10 reasons:
The recent DDoS attacks have shone a bright spotlight on the security problems within the Internet of Things. The attacks are also a reminder that cloud security is still a work in progress.
That’s not to say that the cloud isn’t secure; instead, the problem may be the way we think about security and the cloud, as InfoWorld explained:
With DDoS attacks, the tendency is to focus on organizations directly affected. Thus, when hacktivists target financial services or gaming sites, the victims are those trying to access those applications. The information is intact, albeit temporarily unavailable.
With Dyn, however, the target was core internet infrastructure, which means any organization that relies on Dyn or works with a service provider dependent on Dyn is affected.
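That transitive exposure is something an organization can audit for itself: inventory which DNS providers each critical service ultimately depends on, and flag any that hinge on a single upstream. The sketch below is a minimal illustration — the domain names and provider mapping are hypothetical, and in practice the data would come from NS lookups (e.g. `dig NS <domain>`) rather than a hand-written table:

```python
# Hypothetical inventory: each external service the business depends on,
# mapped to the DNS providers serving its nameservers.
DEPENDENCIES = {
    "payments.example.com": {"Dyn"},
    "cdn.example.net": {"Dyn", "Route 53"},
    "auth.example.org": {"Cloudflare"},
}

def single_provider_risks(deps):
    """Return services whose DNS resolution hinges on exactly one provider."""
    return {svc: next(iter(providers))
            for svc, providers in deps.items()
            if len(providers) == 1}

if __name__ == "__main__":
    for svc, provider in single_provider_risks(DEPENDENCIES).items():
        print(f"{svc}: single point of failure via {provider}")
```

In this toy inventory the CDN survives a Dyn outage because it has a second provider, while the payments and auth endpoints each depend on one provider alone — the pattern that made the Dyn attack so far-reaching.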
Of all the ways in which advanced analytics and machine intelligence can impact the enterprise business model, perhaps none is more crucial than their effect on IT itself.
As infrastructure becomes more distributed and data loads become more complex, IT must become more adaptive, even to the point where the workload exceeds any technician’s ability to collect operating data, figure out what it all means, and implement the required changes. So before organizations turn Big Data loose on functions like sales, marketing and compliance, it makes sense to apply it to the infrastructure and operational layers of the data environment itself.
This can be done in numerous ways. Power management firm Eaton recently launched the PredictPulse Insight platform that uses a cloud-based analytics engine to track power distribution throughout the data center to predict failures and optimize efficiency. The system ties into the PredictPulse remote monitoring service to produce a more predictive, proactive model of energy management. Users are provided with real-time data over an online dashboard that details alarm settings, performance metrics, service history and a host of other points, all of which can be accessed by either a traditional web portal or a mobile app.
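Eaton has not published PredictPulse’s internals, but the core predictive idea — flag power readings that drift sharply from a recent baseline before they become failures — can be sketched generically. The example below is a hypothetical illustration of that pattern, not Eaton’s algorithm; the window size, threshold, and sample data are all made up:

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=10, z_threshold=3.0):
    """Flag readings that deviate sharply from a trailing baseline.

    readings: sequence of power-draw samples (e.g. kW per interval).
    Returns (index, value) pairs lying more than z_threshold standard
    deviations from the mean of the preceding `window` samples.
    """
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma and abs(readings[i] - mu) / sigma > z_threshold:
            anomalies.append((i, readings[i]))
    return anomalies

# Steady draw around 5.0 kW with one spike a monitoring service should surface.
samples = [5.0, 5.1, 4.9, 5.0, 5.2, 5.0, 4.8, 5.1, 5.0, 4.9, 9.7, 5.0]
print(flag_anomalies(samples))  # → [(10, 9.7)], the spike at index 10
```

A production system would feed a model like this continuously from sensor telemetry and raise alarms through the dashboard, but the principle — compare each new reading against a learned baseline — is the same.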
What does it take to run a smart data center?
For many businesses, the data center is the heart of software technology—the “thing” enabling businesses to do more, efficiently expand their capabilities, and maintain the information necessary to run their business properly. A smart data center is needed to support the demands of emerging application deployment models, such as the Internet of Things (IoT), cloud, platform-as-a-service, software-as-a-service, and other models on the verge of becoming mainstream. As business needs evolve, companies are demanding more from their data centers.
Are data centers up to the challenge?
As some parts of the Northeast experience their first frost/freeze of the season, this is a good time to make some cold weather preparations.
NOAA’s recently issued U.S. Winter Outlook said the development of La Niña, the climate phenomenon and counterpart of El Niño, is expected to influence winter conditions this year.
La Niña favors drier, warmer winters in the southern U.S. and wetter, cooler conditions in the northern U.S., but because forecasters expect it to be weak and short-lived, we probably shouldn’t bet against snow.