Computerworld UK — Most companies are spurning the chance to improve their anti-fraud and anti-bribery efforts by not taking full advantage of big data analysis, according to research from business consulting firm EY.
EY found that 63 percent of senior executives surveyed at leading companies around the world agreed that they need to do more to improve their anti-fraud and anti-bribery procedures, including the use of forensic data analytics (FDA).
The survey polled more than 450 executives in 11 countries, including finance professionals, heads of internal auditing and executives in compliance and legal areas. They were asked about their use of FDA in anti-fraud and anti-bribery compliance programs.
CIO — Project managers are in short supply, and that will leave many organizations woefully disadvantaged as the economy rebounds, according to a recent study by project management training company ESI International.
The ESI 2013 Project Manager Salary and Development Survey, based on data from 1,800 project managers across 12 industries in the U.S., reports that as projects continue to grow in size and complexity, many organizations find themselves both understaffed and reliant on underdeveloped project management professionals. And that's putting them at a competitive disadvantage.
"Budget constraints, an aging base of professionals and a looming talent war all contribute to a talent crisis that should be addressed from the highest levels of the organization," says Mark Bashrum, vice president of corporate marketing and open enrollment at ESI International.
Big data frightens me sometimes. Seeing this headline from Information Week, “IBM: We'll Stand Up To NSA,” gave me heart palpitations.
It sounds noble and all, but really, when a large corporation is willing to stand up to the NSA over data about you or your company … is there any chance you’re winning? It reminds me of Tolkien’s “The Hobbit,” when the trolls are arguing over how to cook the dwarves. Roasting, boiling or jelly — it’s all the same to the dwarves in the end, right?
David J. Walton, a litigator who specializes in technology issues, took a look at how companies are really using Big Data. He’s an attorney, so this isn’t about business cases, ROI or any of that stuff — it’s about law, and when viewed through that lens, this is Brave New World stuff.
It has been a decade since VoIP became a standard telecommunications tool. Its age has not slowed the development of the technology, however. For instance, Twilio this week announced a VoIP advancement that it says could improve ease of use of enterprise-based systems.
According to GigaOm, Twilio will use an approach called Global Low Latency (GLL), which repurposes the approach used by the public switched telephone network (PSTN) that VoIP is displacing.
A call on the PSTN offers great quality because a circuit is guaranteed. VoIP, to this point, has cut costs by sending packets via the best available path. Though cheaper, this approach introduces imperfections. Twilio’s idea is to limit the extremes of the traditional VoIP approach:
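The trade-off described above can be sketched in a few lines: instead of handing every call to best-effort routing, pick the media gateway with the lowest measured round-trip time. This is only an illustration of the idea behind GLL, not Twilio's implementation; the gateway names and latency figures are invented.

```python
# Hypothetical sketch: choose the lowest-latency media gateway for a call,
# rather than trusting best-effort packet routing. All values are invented.

def measure_rtt(gateway):
    """Stand-in for a real round-trip-time probe; returns milliseconds."""
    simulated = {"us-east": 48.0, "eu-west": 121.0, "ap-south": 203.0}
    return simulated[gateway]

def pick_gateway(gateways):
    """Route the call through whichever gateway currently answers fastest."""
    return min(gateways, key=measure_rtt)

print(pick_gateway(["us-east", "eu-west", "ap-south"]))  # -> us-east
```

In practice a provider would probe continuously and re-route media as conditions change; the point is simply that latency, not cost, drives the path selection.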
CSO — "Data Lake" is a proprietary term. "We have built a series of big data platforms that enable clients to ingest any type of data and to secure access to individual elements of data inside the platform. We call that architecture the data lake," says Peter Guerra, Principal, Booz Allen Hamilton. Yet these methods are not exclusive to Booz Allen Hamilton.
"I have read what's available about it," says Dr. Stefan Deutscher, Principal, IT Practice, Boston Consulting Group, speaking of the data lake; "I don't see what's new. To me, it seems like re-vetting available security concepts with a name that is more appealing." Still, the approach is gaining exposure under that name.
In fact, enterprises are showing enough interest that vendors are slapping the moniker on competing solutions. Such is the case with the Capgemini / Pivotal collaboration on the "business data lake," where the vendors use the name to highlight how their offering differs.
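The distinguishing claim above is securing access to individual elements of data inside the platform. A minimal sketch of that idea is label-based filtering: each stored element carries an access label, and reads are filtered against the caller's clearances. The field names, labels, and clearance model here are invented for illustration, not any vendor's design.

```python
# Illustrative element-level access control: every data element carries a
# label, and a read returns only elements the caller is cleared to see.
# Labels and clearances are invented for this sketch.

RECORDS = [
    {"value": "public report", "label": "public"},
    {"value": "customer PII", "label": "restricted"},
    {"value": "trade secret", "label": "secret"},
]

USER_CLEARANCES = {
    "analyst": {"public"},
    "compliance": {"public", "restricted"},
    "admin": {"public", "restricted", "secret"},
}

def visible_to(user):
    """Return only the data elements the user's clearances allow."""
    allowed = USER_CLEARANCES.get(user, set())
    return [r["value"] for r in RECORDS if r["label"] in allowed]

print(visible_to("compliance"))  # -> ['public report', 'customer PII']
```

As Deutscher's comment suggests, nothing here is new security technology; it is familiar label-based access control applied at a finer granularity.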
Shadow IT is a fact of life for nearly every IT department across the board. But does that mean it’s time to throw in the towel? Not exactly, but it does mean that things will have to change, both for users and managers of data infrastructure.
First, some numbers. According to CA Technologies, more than a third of IT spending is now heading to outside IT resources, and this is expected to climb to nearly half within three years. The figures are shocking, but keep two things in mind: First, they come from CA, which makes its living building systems that help organizations keep track of their data infrastructure, and second, they represent all outsourcing activity, not just what is termed “shadow IT.”
In this article Charlie Maclean-Bristol, a highly experienced business continuity consultant, lists ten areas where many business continuity plans can be improved. How does your plan stack up?
Charlie’s list is as follows:
1. Scope. On many of the business continuity plans that I see, the scope of the plan is not clear. The name of the department may be on the front of the plan, but it is not always obvious whether this covers the whole department, which may span many sites, or just the department based in one location. Strategic and tactical plans should also state which part of the organization they cover; where large organizations have several entities and subsidiaries, it should be clear whether the plans extend to these.
2. Invocation criteria. I believe it should be clear what sort of incidents should cause the business continuity plan to be invoked. These invocation criteria should be 'SMART' (specific, measurable, attainable, realistic and timely), so as not to be open to misinterpretation. The criteria should be easy to understand, so that if you get a call at 3 a.m. to inform you of an incident, it is obvious whether to invoke or not. Focus should be on the loss of an asset, such as a building or an IT system, not on the cause of the loss. There also needs to be a 'catch-all' in the invocation criteria, which says 'and anything else which could have a major impact on our operations', so that the criteria are not too rigid if you need to invoke for an incident you have not yet thought of.
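Charlie's second point can be sketched as an explicit checklist: a short list of measurable criteria, plus the catch-all so that rigid wording never blocks a justified invocation. The criteria below are invented examples, not from any real plan.

```python
# Hypothetical sketch of SMART invocation criteria as a checklist,
# including the 'catch-all' clause. The criteria are invented examples.

CRITERIA = [
    "loss of the primary building for more than 4 hours",
    "loss of a critical IT system for more than 2 hours",
    "loss of more than 50% of staff at one site",
]

def should_invoke(incident, major_operational_impact=False):
    """Return True if the business continuity plan should be invoked."""
    if incident in CRITERIA:
        return True
    # Catch-all: 'anything else which could have a major impact on our
    # operations' -- the criteria must not be too rigid for the unforeseen.
    return major_operational_impact

print(should_invoke("loss of a critical IT system for more than 2 hours"))  # -> True
print(should_invoke("regional flooding", major_operational_impact=True))    # -> True
print(should_invoke("brief power flicker"))                                 # -> False
```

The value of writing criteria this concretely is exactly the 3 a.m. test: the duty manager can answer "invoke or not?" without interpretation.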
Costs and benefits of BCM: let us ask the right questions, not answer the wrong ones
By Matthias Rosenberg
The costs and benefits of BCM: I have dealt with this issue for almost 20 years now, and it always comes back to one question: why would a company invest in something that does not contribute to revenue and that is meant to protect the company against something that hopefully never happens? This question is quite understandable from a business perspective and therefore justified as a basic question. Those who cannot give a plausible answer to it will fall at the first hurdle. This issue is fundamental to our profession and at the same time underrepresented in the BCM literature. Even the Good Practice Guide (GPG) 2013 does not treat selling BCM as a central task of a BC manager; yet in reality, the sale and presentation of the business continuity topic are critical for our success.
Soft skills are as important in BCM as in any other management discipline.
Let me give you some examples: BCM professionals need strong presentation skills and they need strong training skills (e.g. to train BCM coordinators). These are specific skills that can be described. Analytical skills (e.g. to prepare BIA results for top management) and communication skills are equally important. In the end it is not enough to read another BCM standard, to take part in a training course or to buy BCM software and hope to run a BCM programme successfully. A BCM professional needs experience and one of the most important skills to implement a BCM programme successfully: patience.
By Jayne Howe
The costs associated with developing and implementing a business continuity program in your organization can vary greatly. Most of the cost variables are going to be dependent on two factors: what you already have in place and what components still need to be addressed; and whether your organization has internal business continuity expertise.
It’s likely that any organization successfully operating in this century will have at least a few basic components in place. They may be components needed to be eligible for insurance coverage; to meet the criteria of regulatory bodies governing your organization’s industry; or to comply with basic building fire codes. But even if you don’t have internal BC expertise, you don’t need to start with a blank piece of paper to configure the other components that a complete and robust business continuity program requires.
Using a business continuity standard as a base guideline for your own internal development can help identify the modules necessary for an all-inclusive and comprehensive BC program. This can be extremely helpful in preventing you from travelling down an incorrect or incomplete path, thereby saving wasted resource time and costs.
Managing vulnerabilities in a business context
By Paul Clark
Network security can be both an organization’s saviour, and its nemesis: how often does security slow down the business? But security is something you can’t run away from. Today’s cyber-attacks have a direct impact on the bottom line, yet many organizations lack the visibility to manage risk from the perspective of the business.
Traditionally, network security revolves around scanning servers for vulnerabilities, then drilling down through the reports to assess how each vulnerability could be exploited and what risk it poses to the server, and finally looking at how those risks can be remediated. Viewing vulnerabilities in this purely technical context leaves a lot to be desired in terms of understanding their actual impact on the business.
These risks can be put into two groups. There is the security risk, which is about compromise. How can the network be compromised and what would happen if the vulnerability was exploited? What damage would be done, and what information could be lost? Assessing these types of risk is usually the domain of the information security team.
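One way to move from the technical view to the business view described above is to weight each vulnerability's technical severity by how critical the affected asset is to the business. The sketch below is a hypothetical illustration; the CVSS-style scores, asset names, and weights are all invented.

```python
# Hedged sketch: rank vulnerabilities by business risk, not raw severity.
# Scores and asset criticality weights are invented for illustration.

VULNS = [
    {"id": "VULN-A", "severity": 9.8, "asset": "test server"},
    {"id": "VULN-B", "severity": 6.5, "asset": "payment system"},
]

# Business criticality of each asset (invented weights on a 0-1 scale).
ASSET_WEIGHT = {"test server": 0.1, "payment system": 1.0}

def business_risk(vuln):
    """Technical severity scaled by the asset's importance to the business."""
    return vuln["severity"] * ASSET_WEIGHT[vuln["asset"]]

ranked = sorted(VULNS, key=business_risk, reverse=True)
print([v["id"] for v in ranked])  # -> ['VULN-B', 'VULN-A']
```

Note the reversal: the technically worse finding (9.8 on a test box) falls below the moderate one on the payment system once business context is applied, which is precisely the visibility gap the article describes.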