Protiviti recently partnered with North Carolina State University’s ERM Initiative to conduct its second annual ‘Executive Perspectives on Top Risks Survey’. The survey gathered the views of more than 370 United States-based board members and C-suite executives about the risks most likely to affect their organizations in 2014.
Key findings included:
- The overall survey responses suggest a business environment in 2014 that is slightly less risky for organizations than it was a year ago; board members, however, view it as more risky this year than in 2013.
- Regulatory change and heightened regulatory scrutiny represent the top overall risk for the second consecutive year.
- Cyber threats and privacy/identity management are seen as an increasing threat.
The top 10 risks as perceived by executives are:
According to the Philadelphia Business Journal and other internet sources, hackers apparently accessed Target's database using a subcontractor's credentials.
The Wall Street Journal reports that a Pittsburgh, PA refrigeration contractor began working with Target in 2006, installing and maintaining refrigeration systems in stores as the discounter expanded its fresh food offerings. Through that relationship, the contractor was linked remotely to Target's computer systems for "electronic billing, contract submission and project management."
Target's liability stems from its IT security advisors' failure to ask the important "what if" questions.
Of course, there’s a personal impact too.
The just-released 2014 Identity Fraud Report by Javelin Strategy & Research reveals that data breaches are now the greatest risk factor for identity fraud.
In 2013, one in three consumers who received notification of a data breach became a victim of fraud, up from one in four in 2012, the report found.
Some 46 percent of consumers with breached debit cards in 2013 became fraud victims in the same year, compared with only 16 percent of consumers whose Social Security number was breached.
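Taken together, those two rates imply that a breached debit card was roughly three times as predictive of same-year fraud as a breached Social Security number. A quick back-of-the-envelope check (the input rates are from the Javelin report; the variable names and arithmetic below are ours):

```python
# Victimization rates reported by Javelin for consumers whose data was breached in 2013
debit_card_rate = 0.46  # breached debit card -> fraud victim in the same year
ssn_rate = 0.16         # breached Social Security number -> fraud victim

# Relative risk: how much more likely fraud was after a debit-card breach
relative_risk = debit_card_rate / ssn_rate
print(f"Debit-card breach victims were {relative_risk:.1f}x more likely to suffer fraud")
```

The ratio works out to just under 2.9, which is why breached payment-card data commands such a premium on underground markets.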
National Business Ethics Survey by Ethics Resource Center Reveals Decline in Workplace Misdeeds, Improvement in Ethics Culture in Past Six Years
ARLINGTON, Va. — Research released today by the Ethics Resource Center (ERC), America’s oldest nonprofit advancing high ethical standards and practices in public and private institutions, reveals that workplace misconduct is at an historic low, having steadily and significantly declined since 2007.
The eighth National Business Ethics Survey (NBES) shows that 41 percent of more than 6,400 workers surveyed said they have observed misconduct on the job, down from 55 percent in 2007. In addition, the report found that fewer employees felt pressure to compromise their standards, down to 9 percent from 13 percent in 2011.
Noted Michael G. Oxley, ERC Chairman of the Board, former Congressman and House co-sponsor of the Sarbanes-Oxley Act of 2002, “Companies are working harder to build strong cultures and implement increasingly sophisticated ethics and compliance programs. The results of the survey are encouraging and show that companies are doing a better job of holding workers accountable, imposing discipline for misconduct and letting it be known publicly that bad behavior will be punished.”
Whether prompted by a whistleblower complaint or by an inquiry from a governmental agency, a company faced with potential employee misconduct must perform an internal investigation. The goals of an internal investigation are to understand the nature and scope of the issue(s) and to take necessary remedial action promptly. To be truly effective, an organization should aim to achieve these goals while minimizing the impact on the company’s routine business operations.
Unfortunately, companies often inadvertently overlook certain issues in this process, which can result in an ineffective investigation and may pose additional litigation risks for the company.
Here is a list of five factors often overlooked when conducting an internal investigation:
It started with IT server virtualisation and then continued with cloud computing. Instead of physical machines running a company’s own software applications, we now simply have interfaces to virtual instances of these things. Computing resources are no longer located in a specific piece of equipment on a company’s premises. They are ‘somewhere’ in the cluster of virtualised servers, or on the network, or in the cloud. Software as a Service (SaaS) takes it all a step further: now not only are businesses relieved of the need to buy and run their own hardware, but there’s someone else to look after the software too. The potential advantages of budget flexibility, resilience and scalability are clear. But that doesn’t change the need to continually verify solid business continuity management, from one end right through to the other.
By Geary Sikich
If we agree on the basic premise that business continuity can be defined as sustaining what is critical to the enterprise’s survivability during periods of discontinuity, then we must recognize that the activity known as the business impact assessment / analysis (BIA) needs to be redefined.
The BIA, as currently practiced, does not necessarily achieve the following:
- Define what is critical to the organization;
- Develop strategies to recover/sustain during times of discontinuity.
I posit a two-phase BIA framework consisting of a pre-event general analysis and a post-event identification and assessment of business impacts and potential consequences for the enterprise.
Events are nonlinear and therefore carry uncertain outcomes. As a result, traditional pre-event BIAs are of little value when built around concepts such as mission criticality, recovery time objectives, and recovery point objectives. Events evolve; the elements of randomness and nonlinearity create an opacity (a quality of being difficult to understand or explain) that a traditional BIA underestimates.
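One way to make the proposed two-phase framework concrete is as a simple data model: a pre-event phase that catalogues, in deliberately general terms, what the enterprise considers critical, and a post-event phase that scores actual impacts against the event as it unfolds. The sketch below is our own illustration of the idea, not Sikich's method; every class, field, and score is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class PreEventAnalysis:
    """Phase 1: general, pre-event catalogue of critical functions."""
    critical_functions: list = field(default_factory=list)

    def add(self, function):
        self.critical_functions.append(function)

@dataclass
class PostEventAssessment:
    """Phase 2: impacts assessed once an actual event is unfolding."""
    event: str
    impacts: dict = field(default_factory=dict)  # function -> impact score (0-10)

    def record(self, function, impact):
        self.impacts[function] = impact

    def most_affected(self):
        return max(self.impacts, key=self.impacts.get)

# Phase 1: performed in calm conditions, kept general rather than
# anchored to fixed recovery time/point objectives
pre = PreEventAnalysis()
for fn in ["order fulfilment", "payroll", "customer support"]:
    pre.add(fn)

# Phase 2: scored against the event actually being experienced
post = PostEventAssessment(event="regional power outage")
post.record("order fulfilment", 8)
post.record("payroll", 2)
post.record("customer support", 5)
print(post.most_affected())  # -> order fulfilment
```

The design point is that Phase 1 avoids committing to fixed recovery targets; prioritization happens in Phase 2, once the actual (nonlinear) event reveals which functions are genuinely impaired.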
By Mark Kraynak
Gartner predicts that global spending on public cloud services will grow from $155 billion this year to $210 billion in 2016. The forces driving enterprise IT to the cloud are faster deployment and easier management, which ultimately translate into lower cost. But at the same time, cloud deployment is significantly increasing security and compliance risk because security solutions have not kept up – leaving high-value assets seriously exposed.
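For scale, Gartner's figures imply a compound annual growth rate of roughly 10 to 11 percent if "this year" is read as 2013, i.e. a three-year horizon. That reading, and the arithmetic below, are our assumptions rather than Gartner's:

```python
# Gartner's reported spending figures, in USD
start_spend = 155e9   # "this year" (assumed 2013)
end_spend = 210e9     # forecast for 2016
years = 3             # assumed horizon: 2013 -> 2016

# Implied compound annual growth rate
cagr = (end_spend / start_spend) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")
```

A growth rate in that range is brisk but not explosive; the article's real concern is that security spending is compounding more slowly still.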
So what are some of the security gaps exposed by this ‘cloudification’ of the data center? They include:
The subject of cloud costs keeps popping up in IT circles, most likely the result of more than two years’ worth of experience in shifting enterprise workloads off traditional data center infrastructure. Increasingly, though, it seems that the cloud is not always the best choice for the pocketbook, particularly when long-term, scale-out architectures are needed.
I touched on this last month when I discussed a number of new analyses that claim internal enterprise resources can be delivered quite efficiently and at broad scale provided they are housed on the same virtual, federated infrastructure that powers most cloud services. Rob Enderle, for example, pointed out that private clouds can come in at half the cost of leading public services depending on the type of workload and the amount of data involved. A key factor in this disparity turns out to be rogue cloud deployments, which can often lead to redundancy and data duplication.
IDG News Service (Boston Bureau) — CIOs still have the last word over most IT spending but over time they will work more closely with business units on buying decisions, a Forrester Research survey finds.
Only 6.3 percent of new technology purchases in the U.S. were made and implemented solely by business units in 2013, according to the report's author, Forrester vice president and principal analyst Andrew Bartels. Some 9 percent of spending involved technology the business unit chose but the CIO's team implemented and managed.
However, "the ideal tech-buying process is one in which the business and the CIO's team work together to identify a need, find and fund a solution, choose the right vendor or vendors, implement it, and manage it," Bartels wrote in the report. "We estimate that more than a third of tech purchases will fit that profile by 2015."