Industry Hot News (6692)
(TNS) - Recovery efforts continue in North Mississippi after damaging storms and tornadoes swept through the region on Dec. 23 and Christmas Day.
As the clean-up continues, municipalities and counties across the state have shown support for the communities affected.
A group of Tupelo city officials and Mayor Jason Shelton traveled to Holly Springs Thursday morning to meet with Holly Springs Mayor Kelvin Buck, who is a Tupelo native.
The group spoke with Buck about their own experiences dealing with the aftermath of the Tupelo tornado in April 2014.
(TNS) - Despite extensive flooding in the St. Louis region, hospital officials say it's business as usual.
Days after intense rains, area rivers are pouring into homes and spilling onto major thoroughfares, impeding access to hospitals south of St. Louis.
SSM Health St. Clare Hospital in Fenton near Highway 141 and Interstate 44 is one of the hospitals located in an area with limited access to major highways.
Jamie Sherman, spokeswoman for the Creve Coeur-based health system, said despite major flooding there has been no influx of patients or need for emergency services.
(TNS) - The year ended Thursday with the Mississippi River cresting at Alton some 3 feet short of the National Weather Service’s original prediction, a New Year’s gift to the city.
“It couldn’t have gone any smoother, with the city staff and volunteers that held it (water) back with a 1,000-foot-long wall, it was absolutely amazing,” a relieved Mayor Brant Walker said of the city’s Downtown flood containment efforts.
Walker spoke to a reporter as the river level at Melvin Price Locks and Dam 26 stabilized at just over 35 feet, fluctuating slightly throughout the day. The weather service had stood by the 35.7-foot prediction it made Wednesday for New Year’s Eve Day, making this the fourth-highest river crest recorded in Alton.
If you are thinking about a career change in 2016, then you might want to have a look at the burgeoning cybersecurity market which is expected to grow from $75 billion in 2015 to $170 billion by 2020.
A knack for cat-and-mouse play may indicate that you have an aptitude for cybersecurity. It is a field where the good guys (cybersecurity professionals) are pitted against the bad guys (cybercriminals, a.k.a. hackers). Assuming you’d want to be a good guy, a career in the field can mean a six-figure salary, job security, and the potential for upward mobility.
More than 209,000 cybersecurity jobs in the U.S. are unfilled, and postings are up 74% over the past five years, according to a 2015 analysis of numbers from the Bureau of Labor Statistics by Peninsula Press, a project of the Stanford University Journalism Program.
Following is a summary of key federal disaster aid programs that can be made available as needed and warranted under President Obama's emergency disaster declaration issued for the State of Missouri.
Assistance for the State and Affected Local Governments Can Include as Required:
- FEMA is authorized to provide appropriate assistance for required emergency measures, authorized under Title V of the Stafford Act, to save lives and to protect property and public health and safety, or to lessen or avert the threat of a catastrophe in the designated areas.
- Specifically, FEMA is authorized to provide debris removal and emergency protective measures (Categories A and B), limited to direct Federal assistance, under the Public Assistance program at 75 percent Federal funding.
FEMA’s mission is to support our citizens and first responders and ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards.
Stay informed of FEMA’s activities online: videos and podcasts are available at www.fema.gov/medialibrary and www.youtube.com/fema; follow us on Twitter at www.twitter.com/fema and on Facebook at www.facebook.com/fema.
During the final quarter of 2015 Continuity Central conducted an online survey asking business continuity professionals about their expectations for 2016. Whilst many of the survey findings are similar to the same survey a year earlier, there are some interesting changes.
A total of 203 responses were received, with the majority (80.7 percent) coming from large organizations (companies with more than 250 employees). The highest percentage of respondents were from the United States (35 percent), followed by the UK (33 percent). Significant numbers of responses were also received from Australia and New Zealand (10 percent) and Canada (5 percent).
The hype around big data and analytics has gone through cycles over the past couple of years, starting with excitement about how much data we have and the potential for it. That moment was followed by that let-down, "now what?" feeling after organizations put the storage and tools in place and found themselves wondering what to do with it. There are so many technologies and trends to track -- machine learning, AI, advanced analytics, predictive analytics, real-time analytics, Hadoop, Spark, other Apache Foundation projects, open source, cloud-based-as-a-service offerings, self-service, and more.
This past year was no exception. Everybody talks about the promise and the potential of big data. Yet there's a sense of disenchantment as CIOs search for use-cases to inspire change inside their own companies. They want to be shown, not told. They want the signal and not the noise.
We noticed that 2015 was a noisy year, and 2016 seems like it will be equally as loud. It's not something that CIOs can afford to tune out. With digital transformations and pure-play startups disrupting established industries -- Uber is the example everyone mentions first -- the pressure is on to leverage data in new ways for competitive advantage. CIOs need to straddle two different worlds -- satisfying their existing customer base while moving fast to deliver instant, data-driven services to customers, or they risk losing ground to market upstarts.
Here are some of the most popular stories that ran on Data Center Knowledge in December.
How the Colo Industry is Changing – Customers are getting smarter about what they want from their data center providers; enterprises use more and more cloud services, and the role of colocation data centers as hubs for cloud access is growing quickly as a result; technology trends like the Internet of Things and DCIM are impacting the industry, each in its own way.
Hot Data Center Startup Vapor IO Raises First Round of Funding – Vapor IO, which came out of stealth earlier this year with a radical new design of the data center rack and sophisticated rack and server management software, has closed a Series A funding round, led by Goldman Sachs, with participation from Austin’s well-known VC firm AVX Partners.
More and more, software functions traditionally executed by the client are being pushed to the server and, increasingly, to the Cloud.
One such example is media transformation: YouTube allows users to upload a video in one of several formats, then transforms and serves it in a number of formats and resolutions for all common video players; the resulting increase in productivity and convenience is tremendous.
Citrix is moving in the same direction: a recent XenApp/XenDesktop feature (the Call Home Telemetry Service) uses the Cloud-based Citrix Insight Services (CIS) to bring the best experience to customers and Citrix support engineers. Here is a simplified schema of how a telemetry facility is typically built:
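In code, such a facility usually boils down to a collect, bundle, and upload pattern. The sketch below is a generic illustration of that pattern only; the endpoint URL, payload fields, and function names are assumptions for this example, not Citrix's actual Call Home API.

```python
import gzip
import json
import time
import urllib.request

# Hypothetical endpoint for illustration; not Citrix Insight Services' real API.
UPLOAD_URL = "https://telemetry.example.com/v1/bundles"

def collect_diagnostics():
    """Client side: gather a small diagnostic snapshot."""
    return {
        "product": "XenDesktop",
        "timestamp": time.time(),
        "events": [{"level": "info", "msg": "service started"}],
    }

def build_bundle(payload):
    """Serialize and compress the snapshot before shipping it to the cloud."""
    return gzip.compress(json.dumps(payload).encode())

def upload(bundle):
    """POST the compressed bundle to the cloud analytics service."""
    req = urllib.request.Request(
        UPLOAD_URL,
        data=bundle,
        headers={"Content-Encoding": "gzip", "Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)
```

Keeping collection, bundling, and transport separate lets the client batch and compress locally, so the cloud service receives compact, uniform bundles it can analyze at scale.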
An earthquake measuring 6.7 magnitude hit northeast India near its border with Myanmar and Bangladesh early Monday, the U.S. Geological Survey (USGS) confirmed. At least eight people were killed and 100 injured by falling debris in Imphal and elsewhere in Manipur state, police said.
The quake struck at 4:35 a.m. local time (6:05 p.m. ET on Sunday), about 20 miles northwest of Imphal, the capital of Manipur.
Media reports said five people were killed by the earthquake in neighboring Bangladesh, but there was no immediate confirmation from authorities.
Strong tremors were felt across the region, the BBC reported. Witnesses described a quake unlike anything they had felt before, NBC News reported, with residents awakened by shouting relatives and intense shaking that lasted from 35 seconds to two minutes.
Healthcare is an industry that can benefit significantly from the use of big data and analytics, although it is currently lagging behind in terms of uptake due to the restrictive policy-driven protection that surrounds medical data.
However, as new technological innovations have improved the ability to anonymize data, successful big data initiatives are likely to have an outsized effect on the industry. This data-driven impact is widely anticipated, too, with Health IT Analytics claiming that 95% of global healthcare leaders believe patient care is likely to change drastically.
This future may be closer than many people realize and almost every healthcare provider is utilizing data in one way or another at the moment. According to the Guardian, ‘Most healthcare organizations today are using two sets of data: retrospective data, basic event-based information collected from medical records, and real-time clinical data, the information captured and presented at the point of care (imaging, blood pressure, oxygen saturation, heart rate, etc).’ That being said, there are still several limitations to what can be done.
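The anonymization step underpinning these initiatives can be illustrated with a small sketch: strip direct identifiers and replace the patient ID with a keyed pseudonym, so records can still be linked across datasets without exposing who they belong to. This is a generic example, not any specific vendor's implementation; the field names and the secret key are assumptions for illustration.

```python
import hashlib
import hmac

# Hypothetical secret held only by the data custodian. With a keyed hash,
# pseudonyms stay consistent across datasets but cannot be reversed or
# recomputed by analysts who lack the key.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymize(patient_id):
    """Replace a direct identifier with a keyed, irreversible pseudonym."""
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

def anonymize_record(record):
    """Drop direct identifiers; keep clinical fields for analytics."""
    out = {k: v for k, v in record.items() if k not in {"name", "ssn"}}
    out["patient_ref"] = pseudonymize(out.pop("patient_id"))
    return out
```

Note that pseudonymization of this kind is only one layer: clinical fields themselves can re-identify patients, which is why policy controls around medical data remain strict.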
Much of the advice provided to automakers in a new McKinsey report has as its unspoken theme some level of information technology, including software development, data collection and analysis, and Internet of Things connectivity. In fact, the report said, software competence is becoming one of the most important differentiating factors for the auto industry.
How automakers manufacture and sell cars has been pretty much the same for the past hundred years. That is about to change, according to a McKinsey & Co. report released today, and information technology -- particularly data collection and analytics -- will play a major role.
According to the report "Automotive Revolution -- Perspective Towards 2030," software competence is becoming one of the most important differentiating factors for the industry in areas including automobile safety systems, Internet connectivity, and infotainment. "As cars are increasingly integrated into the connected world, automakers will have no choice but to participate in the new mobility ecosystems that emerge as a result of technological and consumer trends," the McKinsey report said.
Salesforce has contracted for 40 megawatts of wind power from a West Virginia wind farm, becoming the latest cloud giant to enter into a utility-scale renewable-energy purchase agreement for its data centers.
The purchase covers more capacity than all of the cloud-based business software giant’s servers consume in data centers that host them. Unlike other cloud giants, Salesforce doesn’t own and operate its data centers, leasing capacity from commercial data center providers instead.
While companies like Google, Facebook, and Microsoft, which own and operate much of their own data center capacity, have been signing larger renewable energy purchase agreements more frequently, there has also been an uptick in renewable energy investment by data center providers this year. This uptick indicates there is now more interest from major data center customers, such as Salesforce, in carbon-neutral colocation.
Google has bought a defunct semiconductor plant in Clarksville, Tennessee, not far from Nashville, planning to convert it into a data center, state officials announced today.
The company expects to invest $600 million in the project. This will be the eighth Google data center in the US.
Hemlock Semiconductor built the $1.2 billion polysilicon plant in 2013 but did not launch it because of deteriorating market conditions for the material, used to make photovoltaic panels. The site has access to a lot of power and has a lot of infrastructure in place that Google can adapt for data center use.
The escalating threat from cybercrime is set to force companies into increasing the skills of their boardroom executives in 2016, a global security and risk management consulting firm has predicted.
"There is a lack of specialist cyber skills in boardrooms worldwide, which is likely to become increasingly clear as 2016 progresses," said Ed Stroz, executive chairman of Stroz Friedberg.
"Companies are under growing pressure from investors, customers and regulators seeking reassurance that cyber risks are being actively managed and that they have the capability to deal with the aftermath of an incident."
Stroz believes that cyber trends - from hacktivist and insider threats to implications of potential cyber legislation in 2016 - will push corporate boards into reviewing their options to ensure they are better informed and comfortable making risk management decisions.
Geary W. Sikich looks at the emerging business and political risks which organizations need to be aware of and make plans for.
It is December 16th 2015 as I write these lines. Today is Beethoven’s birthday, we are at the yearend and as Christmas approaches it is time to look at what 2016 may bring us. How well will we do, or, how poorly will we perform when, and if, unplanned for crises emerge from threats that we continue to overlook?
My top picks for threats, emerging crisis issues and high impact risks in 2016 and their current status are:
Throughout 2015, Everbridge was proud to work hand-in-hand with corporations of all sizes, across all industries, to deliver top-notch security and safety for stakeholders. Corporations are under immense pressure to keep employees, infrastructure and customers safe during various types of events: weather-related emergencies, building security failures, data breaches, etc. The past year proves how critical it is for corporations to leverage a notification system to communicate with stakeholders and improve business continuity. With 2016 quickly approaching, we took a trip down memory lane and gathered some of our “best of 2015” quotes, inclusions and testimonials from our partners, employees and customers. Throughout the “best of” list, several themes persist, including threat monitoring, IT Alerting and the Internet of Things.
Thanks for taking some time to reflect on 2015 — here goes!
As business continuity professionals, we do our best to make sure that our organizations are able to withstand disruption and carry on in as normal a way as possible. But how do you cope when the disruption is so widespread? Even if by some miracle your organization remains intact and functional, devastation still lies all around you. Your customers and suppliers may not be able to access you. Your customers and suppliers may no longer exist.
This is what the people of Chennai are facing, a city in India where the BCI has only recently set up a new Forum. Torrential rain has resulted in terrible flooding. Hundreds of people have died, and thousands of families have been displaced. As many of us celebrate the season of peace and goodwill, it is important that we share a little bit of that with others. In a season when we can become so obsessed with what we get as presents, it is important that we keep our minds open to what we can give.
It has become traditional for the Business Continuity Institute to make a donation at this time of year and this year we will be sending money to the Chennai Flood Relief Fund being organized by Global Giving. Initially, the fund will help first responders meet survivors' immediate need for food, fuel, clean water, hygiene products, and shelter. Once initial relief work is complete, the fund will transition to support longer-term recovery efforts run by local, vetted organizations. If you would like to make a donation, just visit the Global Giving website.
The BCI wishes all our Chapter Leaders, Forum Leaders, the BCI Board, Global Membership Council and fellow business continuity and resilience professionals around the world, Season’s Greetings and a healthy 2016.
Note that the BCI Central Office will be closed on the 25th and 28th December and the 1st January 2016, re-opening on Monday 4th January 2016. On the 29th, 30th and 31st December, the office will be staffed between 10am and 3pm only (GMT).
How do you prepare for the unexpected? What can you learn from past severe weather events? Are you ready for the next big El Nino?
Communities along the southern Pacific coast are forced to ask themselves these questions ahead of the upcoming storms, predicted to arrive in January and last as long as May. El Nino typically cycles every three to seven years and brings unusually wet conditions, causing flooding, mudslides, frequent storms, buckled roads, and destroyed homes. A climatologist at NASA’s Jet Propulsion Lab warned that “these storms are imminent…El Nino is here. And it is huge.”
Communities in storm-prone locations have already started preparing, drawing on lessons from the strongest El Nino on record, in 1997-98. With this year’s El Nino predicted by the National Weather Service to be the second largest, no precaution is being overlooked. The California Department of Transportation has increased its maintenance staff by 25 percent, and in Malibu, public works departments will be on call 24/7 during the storm.
(TNS) - The International Code Council has approved building code changes recommended by the National Institute of Standards and Technology after it conducted an in-depth investigation into the EF-5 tornado that struck Joplin, Mo., on May 22, 2011.
Enhanced protection will be required for new school buildings and additions to buildings on existing school campuses, as well as high-occupancy structures associated with schools where people regularly assemble, such as a gymnasium, theater or community center.
Under the updated codes, storm shelters must be provided that protect all occupants from storms with wind speeds of 250 mph, representing the maximum intensity category of EF-5.
(TNS) - Security at France’s 58 nuclear power plants was purportedly raised to its highest level last month as a result of the terrorist attacks in Paris, stoking concern over the safety of Japan’s nuclear facilities.
After the triple meltdown in Fukushima in 2011, Japan shut down all 48 of its viable commercial reactors in light of the crisis. But attempts are now being made to bring many back online.
And despite opposition from anti-nuclear activists and groups, two reactors in Sendai, Kagoshima Prefecture, were restarted this fall and summer, with applications for 26 more pending Nuclear Regulation Authority approval.
Hoping to exploit the edge over VMware in the enterprise data center it has due to the massive scale of its public cloud, Microsoft is preparing to launch the first preview release of Azure Stack – a private Azure cloud environment a company can stand up in its own data center that will look exactly like the public version of Azure to users and be seamlessly integrated with the public cloud.
This is a similar angle on hybrid cloud to the one VMware has been pursuing since 2013, when it announced its vCloud Hybrid Service, later rebranded as vCloud Air. VMware promised a virtual extension of a customer’s on-premises VMware environment into the cloud.
The public cloud portion of VMware’s hybrid cloud is hosted in fewer data centers than Azure, relying on a smaller footprint in colocation facilities, while Microsoft spends billions of dollars on massive data centers around the world, in some cases building its own and in others leasing large facilities wholesale.
Much of IT security revolves around the question of how much you believe users can think for themselves. Password salting is a solution likely to appeal to those who think users are unreliable, careless or otherwise unable to behave correctly when it comes to the proper use of passwords. Yet the brain is a muscle and needs regular exercise, including password push-ups and security question squats. Which way should you go? To help answer that question, first try our super-fast primer on what password salting actually is; or if you prefer, how to explain its importance to your CEO.
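As a companion to the primer, here is a minimal sketch of what salting looks like in practice, using Python's standard library. PBKDF2 stands in here for whatever key-derivation function a real system would choose; the iteration count and salt size are illustrative.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a hash from a password and a random per-user salt."""
    if salt is None:
        salt = os.urandom(16)  # unique random salt for each user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, expected):
    """Re-derive the hash with the stored salt; compare in constant time."""
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, expected)

# Two users with the same password store different hashes, which is the
# point of salting: precomputed (rainbow-table) attacks no longer apply.
salt_a, hash_a = hash_password("hunter2")
salt_b, hash_b = hash_password("hunter2")
assert hash_a != hash_b
assert verify_password("hunter2", salt_a, hash_a)
```

The salt is stored alongside the hash and is not a secret; its job is simply to force an attacker to crack each account individually rather than against a precomputed table.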
One of the biggest trends in the tech sector at the moment is undoubtedly mobile. With smartphones and tablets becoming more powerful every year, many people now view them as a practical replacement for their desktop or laptop PC.
In the third quarter of 2015 alone, nearly 353 million smartphones were sold around the world – a 15.5 per cent increase over the same period the previous year, according to Gartner. And it is not only in people’s personal lives where these devices are set to make an impact, as businesses across all sectors can expect to see these gadgets entering the workplace more frequently.
Often, business smartphones and tablets won’t be issued by the company, but will be the personal devices of employees. This trend is known as bring your own device (BYOD), and if you haven’t yet encountered it, you can expect to do so sooner rather than later.
Regulatory compliance is a fact of life for every enterprise. And since security has been in the hot seat lately, everyone’s paying more attention – and concern – to compliance. Businesses face increased scrutiny and are tasked with managing a growing number of regulatory requirements that must be met. At the same time, competitive pressures are mounting with the development of new technologies and the evolution of customer expectations for digital experiences. Is it possible for businesses to deliver new products and services at high velocity while still satisfying their obligations for compliance?
In every company, software is playing an increasingly pivotal role. Software-based services are often the primary way a company connects and communicates with customers. From sophisticated banking services accessed entirely through mobile phones and browsers to automobiles differentiated in the market by how well they integrate with the consumer’s technology ecosystem, software is today’s competitive currency.
Enterprises have more motivation than ever to reconcile the conflict between complying with regulatory requirements and competing in the fast-moving digital marketplace. Enter DevOps.
Christmas is rapidly approaching -- but is your customers' sensitive information safe? IT security remains a top concern for many IT professionals, which is reflected in recent data.
The June 2015 Spiceworks Voice of IT survey revealed about three-quarters of IT professionals considered their organizations at risk for technology, IT security and man-made disasters or incidents.
In addition, 60 percent of respondents said they believe their organizations are not adequately investing in IT security.
Managed service providers (MSPs), however, can help customers improve their security and safeguard their sensitive data throughout the holiday season and into 2016.
The recent terrorist attacks in Paris and San Bernardino serve as reminders that man-made disasters are a growing reality in today’s world. Business resiliency, security and information technology professionals know they have a responsibility to prepare their organizations for frightening and disruptive events such as these. Further, these preparations must include methods for communicating across the organization in a secure, rapid and accurate way.
While typical mass notification methods such as SMS, telephony and email are viable channels in many cases, they each have their limitations. Take, for example, SMS service in Paris after the terrorist attack. The volume of SMS spam traffic into France compelled the government to block the delivery of certain types of international text messages (particularly two-way messages). This move negatively impacted the ability of certain businesses to communicate with employees and other stakeholders in the region via this widely-used channel.
Resiliency managers can’t control the actions of foreign or domestic governments. However, they can deploy the latest communication technologies that minimize or eliminate communication barriers, while gaining a greater degree of control over stakeholder interactions.
SACRAMENTO, Calif. – The California Governor’s Office of Emergency Services (Cal OES), the Federal Emergency Management Agency and the U.S. Small Business Administration (SBA) have approved more than $30 million in disaster recovery grants and loans for survivors of the Butte and Valley wildfires.
“The job isn’t finished,” said FEMA Federal Coordinating Officer Tim Scranton. “We continue working with all of our recovery partners through the holiday season to help the survivors and communities in Calaveras and Lake counties recover and rebuild.”
“We have excellent teams who are dedicated to the mission,” said Cal OES State Coordinating Officer Charles Rabamad. “I’m continually inspired by the hard work and focus everyone has on trying to get those who were burned out of their houses into homes for the holidays."
Here is a snapshot of state and federal disaster assistance approved to date:
- The registration period for federal assistance ended Nov. 23, 2015. During that timeframe, more than 3,700 Californians contacted FEMA for information or registered for assistance with FEMA.
- $940,000 approved for survivors through California’s State Supplemental Grant Program.
- More than 1,500 survivor households have been approved for a total of more than $11.5 million in FEMA Individual Assistance grants.
- Of that, nearly $7.5 million was approved in Housing Assistance, which can include grants to help cover home repair and replacement costs as well as financial rental assistance.
- 833 survivor households are receiving rental assistance. Of that number, 606 are renters and 227 are homeowners.
- More than $4 million was approved for Other Needs Assistance, which helps survivors cover the cost of replacing lost contents and other disaster-related expenses.
- SBA has approved $19.2 million in low-interest disaster loans to help business owners and residents with their recovery.
- $16.9 million approved for 190 homeowners and renters.
- $2.2 million for 34 businesses.
- 35 survivor households are currently sheltering at hotels and motels through FEMA’s Transitional Sheltering Assistance program. The program is designed to provide temporary sheltering until alternative housing accommodations are made available.
Helping survivors find a safe, secure temporary place to live is the number one priority of the state and federal recovery team. FEMA is working with eligible survivor households in both counties to ensure their temporary housing needs are met. When it comes to temporary housing for survivors, the first option is always rental assistance as it is the fastest and most efficient form of temporary housing.
FEMA continues connecting eligible survivors with available rental resources within a reasonable commuting distance from their community. For survivors in areas where rental resources are not available, the agency is working to provide Manufactured Housing Units on both private sites and commercial sites.
FEMA, the state and the counties are coordinating to complete debris removal, secure utilities and complete required local licensing to move more Manufactured Housing Units onto feasible private sites. FEMA is also working with property owners at various commercial sites to complete required upgrades and move more units onto those locations.
Survivors can make changes or track their grant status online at DisasterAssistance.gov or by calling 800-621-3362; TTY 800-462-7585; 711 or Video Relay Service (VRS), call 800-621-3362.
Although the deadline has expired to apply for property damage loans from SBA, small, non-farm businesses, small agricultural cooperatives, small businesses engaged in aquaculture and most private nonprofit organizations of any size may continue to apply for an SBA Economic Injury Disaster Loan (EIDL) to help meet working capital needs caused by the disaster. EIDL assistance is available regardless of whether the business suffered any property damage. These loans help meet financial obligations and operating expenses, which could have been met had the disaster not occurred.
EIDL applicants may apply online via SBA’s secure website at https://disasterloan.sba.gov/ela. Disaster loan information and application forms are also available from SBA’s Customer Service Center by calling 800-659-2955 or emailing firstname.lastname@example.org. Individuals who are deaf or hard of hearing may call 800-877-8339. For more disaster assistance information, or to download applications, visit www.sba.gov/disaster.
For more information on California’s wildfire recovery, go to caloes.ca.gov and fema.gov/disaster/4240 and follow us on Twitter @femaregion9 and @Cal_OES, and on Facebook at facebook.com/FEMA and facebook.com/CaliforniaOES.
Disaster recovery assistance is available without regard to race, color, religion, nationality, sex, age, disability, English proficiency or economic status. If you or someone you know has been discriminated against, call FEMA toll-free at 800-621-FEMA (3362). If you have a speech disability or hearing loss and use a TTY, call 800-462-7585 directly; if you use 711 or Video Relay Service (VRS), call 800-621-3362.
FEMA’s temporary housing assistance and grants for public transportation expenses, medical and dental expenses, and funeral and burial expenses do not require individuals to apply for an SBA loan. However, applicants who are referred to SBA for a disaster loan must apply to be eligible for additional FEMA assistance that covers personal property, vehicle repair or replacement, and moving and storage expenses.
The SBA is the federal government’s primary source of money for the long-term rebuilding of disaster-damaged private property. SBA helps businesses of all sizes, private non-profit organizations, homeowners and renters fund repairs or rebuilding efforts and cover the cost of replacing lost or disaster-damaged personal property. These disaster loans cover losses not fully compensated by insurance or other recoveries and do not duplicate benefits of other agencies or organizations. For more information, applicants may contact SBA’s Disaster Assistance Customer Service Center by calling 800-659-2955, emailing email@example.com.
Given the fact that most IT organizations are now storing orders of magnitude more data than they ever did in the past, it should not come as a surprise that usage of data deduplication tools is on the rise. The challenge is that different types of data respond better to different types of data deduplication algorithms.
To make it simpler for IT organizations to invoke those algorithms at the right time, Exablox this week announced that it is adding support for variable-length deduplication to its OneBlox storage appliances, alongside existing support for fixed-length deduplication and inline compression.
Sean Derrington, senior director of product management for Exablox, says that means within the context of a single storage pool, IT organizations can now apply policies to data that automatically invoke the most appropriate approach to data deduplication based on the type of data being stored.
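The difference between the two chunking approaches can be sketched in a few lines. This is an illustrative toy, not Exablox's implementation: a real system would use a proper rolling hash such as Rabin fingerprinting rather than hashing a sliding window from scratch.

```python
import hashlib

def fixed_chunks(data, size=8):
    """Fixed-length chunking: split at constant byte offsets."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def variable_chunks(data, mask=0x0F, window=4):
    """Content-defined (variable-length) chunking: cut wherever a
    fingerprint of the last `window` bytes matches a bit mask, so chunk
    boundaries follow the content and realign after insertions."""
    chunks, start = [], 0
    for i in range(window, len(data)):
        fp = int.from_bytes(hashlib.sha1(data[i - window:i]).digest()[:4], "big")
        if fp & mask == 0:
            chunks.append(data[start:i])
            start = i
    chunks.append(data[start:])
    return chunks

def dedup_ratio(chunks):
    """Fraction of chunks that are unique; lower means better dedup."""
    unique = {hashlib.sha256(c).digest() for c in chunks}
    return len(unique) / len(chunks)
```

With fixed-length chunking, inserting a single byte shifts every boundary after it, so previously identical chunks stop matching; content-defined boundaries downstream realign. That is why variable-length deduplication tends to suit data that is edited in place, while fixed-length works well for data with stable block alignment, and why matching the algorithm to the data type matters.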
Natural catastrophes made up the lion’s share of global insured disaster losses in 2015, but a man-made loss was the year’s costliest.
Preliminary estimates from Swiss Re sigma put insured losses from disaster events at $32 billion in 2015, of which $23 billion were triggered by natural catastrophes and $9 billion by man-made disasters.
The explosions at the Port of Tianjin, China in August are expected to lead to claims of at least $2 billion, making it the costliest event of the year and the biggest man-made insured loss in Asia ever, sigma said.
(TNS) - Ohio ranks in the bottom tier of states when it comes to preparing for and handling outbreaks of infectious disease, according to a new report.
The state received points for just three of 10 indicators examined in the report, “Outbreaks: Protecting Americans From Infectious Diseases.”
That means Ohio tied six other states — Idaho, Kansas, Michigan, Oklahoma, Oregon and Utah — for last place.
The five highest-scoring states — Delaware, Kentucky, Maine, New York and Virginia — received points for eight of 10 indicators.
A new hazard mitigation plan lays out how local community officials can reduce vulnerability to natural and man-made hazards in Chatham County. That reduced vulnerability, in turn, can lead to lower flood insurance rates.
Emergency planners explained the latest edition of Chatham County's hazard mitigation plan at a public meeting Thursday afternoon at Garden City City Hall.
For the first time, the 2015 plan includes the threat of sea level rise, a reality that's becoming more apparent as high-tide flooding more frequently swamps area roads.
"In all coastal counties we're seeing a lot of that," said Margaret Walton, project manager for Atkins, the consulting company that helped produce the plan.
The media coverage and public debate following the November terror attacks in Paris might give one the impression that ISIS has suddenly become the top cyber threat to Western countries. Officials in France, the U.K., and Canada have seized on the Paris attack to promote a number of cyber security initiatives. In the United States, we have seen a renewed debate over encryption, as well as calls from leading Democratic and Republican presidential candidates to censor the Internet to combat the threat that ISIS poses there. This is despite the fact that the Paris attacks were not cyber attacks and were planned “in plain sight,” without widespread use of sophisticated encryption technologies by the attackers.
We should ask two questions: First, has our attention really shifted towards ISIS as a cyber threat? Second, if so, is this shift warranted? In short, my answer to these questions is yes, there is reason to believe our attention has shifted, and no, this shift is not warranted.
As I have argued in my previous work, close observers of the history of the U.S. cyber security debate have noted a tendency for cyber threat perceptions to mirror larger national security concerns. That is, the perception of cyber threat actors can be influenced by other perceived threats that are not primarily about cyber security. Paris seems to provide an example of this phenomenon.
Microsoft made a data analytics acquisition. IBM expanded its IoT Watson efforts with new APIs. Apple shut down its Twitter analytics acquisition. For this week's big data roundup, let's start with the threat of evil algorithms and robot overlords.
Well, maybe it's not that drastic, but if that dystopian future is coming, we may be better prepared now, thanks in part to Tesla founder Elon Musk.
Musk and several other tech firms and entrepreneurs are pooling their fortunes to launch OpenAI, a nonprofit artificial intelligence research company. The aim is to advance digital intelligence in a way that is most likely to benefit humanity as a whole, unconstrained by the drive for financial return.
I’ve noticed recently that many individuals working on various projects and programs, including Disaster Planning and Business Continuity, seem afraid to actually communicate some of the difficulties they’re encountering. With most projects and programs, executives and sponsors expect to receive regular updates on the effort and on any major issues they need to be aware of. In the majority of cases, project status is reported as one of the following:
1. GREEN – all is well and tracking to schedule, scope and budget;
2. AMBER (Yellow) – some minor hiccups and need to deal with some smaller issues or risks, which may need some participation by the sponsor to ensure scope, budget and schedule get back on track; and,
3. RED – all heck has broken loose and we’ve got a major problem.
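The three-state convention above is simple enough to capture in a few lines, which can help keep status reports consistent across projects. A quick sketch (the names and wording here are illustrative, not a standard):

```python
from enum import Enum

class Status(Enum):
    # RAG (red/amber/green) project status convention.
    GREEN = "on track for scope, schedule and budget"
    AMBER = "minor issues or risks; sponsor attention may be needed"
    RED = "major problem; immediate escalation required"

def report(project: str, status: Status) -> str:
    # One-line status summary suitable for an executive update.
    return f"{project}: {status.name} - {status.value}"
```

For example, `report("BC plan refresh", Status.AMBER)` yields a one-liner an executive can scan in seconds; the hard part, as the passage notes, is having the candor to use RED when it is warranted.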
As we close out the year, it is now time to begin the retrospective reviews and predictions for the New Year. I will try to keep them to a minimum but I find it important to reflect and look forward to new challenges.
Compliance is a fast-moving profession. More attention is being paid to the compliance function, and more companies are embracing the importance of compliance. The last twenty years have seen an explosion in enforcement, and a corresponding growth in compliance as the natural response.
As compliance begins to mature and establish itself on the governance landscape, there are many important challenges and trends. Compliance has to continuously evaluate itself as a function and as a profession. More structure is needed around training, professional standards and formal education programs. Until these issues are addressed, compliance is a profession in search of subject-matter experts.
Structuring a solid data team would be a lot easier if there were a common blueprint that worked equally well for all organizations. However, since each organization is unique and change is constant, companies must continually reassess their needs.
Technology hype and competitive pressures tend to frustrate strategic thinking, however. Instead of defining goals and identifying problems that need to be solved up front, organizations sometimes acquire technology or talent without a plan, which tends to negatively affect ROI.
"You need to have a really well-defined business case beforehand," said Jonathan Foley, VP of science at recruiting software provider Gild, in an interview. "Companies are building out data science teams before they need them, before they understand what data science is and what is going to be the desired effect on the business. It's a me-too phenomenon where it's seen as something that can have a competitive advantage. But unless the leadership really understands the expected outcome of having data science and machine learning, it just becomes a difficult task. You don't know who to hire and you don't know how to manage the team once you have it."
Amid all the technological changes set to take place in the coming year, the enterprise is on the verge of a momentous operational and organizational transformation as well. One of the most significant aspects of this is the rise of Dev/Ops as the driving force behind the delivery of IT services.
Before long, virtually the entire data stack will sit atop a virtual architecture residing on commodity hardware. Sure, there will always be a need for bare-metal functionality, but even then, those resources will be treated like managed services within an automated, software-defined ecosystem.
This means knowledge workers hoping for a new application won’t have to wait for coders and IT technicians to come together in a months-long development process that usually ends in either marginal success or abject failure. In the future, a combined Dev/Ops team, including the business unit in need of the app, will create the code, test it in the lab, provision the virtual resources, and then launch it into production environments, all within a matter of days or weeks.
Will it be cheaper to run a particular application in the cloud than keeping it in the corporate data center? Would a colo be cheaper? Which servers in the data center are running at low utilization? Are there servers that have been forgotten about by the data center manager? Does it make sense to replace old servers with new ones? If it does, which ones would be best for my specific applications?
Those are examples of the essential questions every data center manager should be asking themselves and their team every day if they aren’t already. Together, they can be distilled down to a single ever-relevant question: How much does it cost to run an application?
Answering it is incredibly complex, which is the reason startups like TSO Logic, Romonet, or Coolan, among others, have sprung up in recent years. If you answer it correctly, the pay-off can be substantial, because almost all data centers are not running as efficiently as they can, and there’s always room for optimization and savings.
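As a rough illustration of why the question is hard, even a deliberately simplified per-application cost model needs several inputs that most shops don't track per workload. The sketch below is my own back-of-the-envelope model, not how TSO Logic, Romonet, or Coolan actually compute costs; it ignores cooling beyond PUE, staff time, software licenses, and networking.

```python
def monthly_app_cost(
    server_capex: float,        # purchase price of the server
    amortization_months: int,   # depreciation period for the hardware
    power_kw: float,            # average power draw of the server
    pue: float,                 # facility power usage effectiveness
    kwh_price: float,           # electricity price per kWh
    utilization_share: float,   # fraction of the server this app consumes
) -> float:
    # Amortized hardware cost per month.
    hardware = server_capex / amortization_months
    # Energy cost per month (~720 hours), scaled up by facility overhead.
    energy = power_kw * pue * 24 * 30 * kwh_price
    # Attribute the app its share of the server's total cost.
    return (hardware + energy) * utilization_share
```

For instance, a $7,200 server amortized over 36 months, drawing 0.4 kW at a PUE of 1.5 and $0.10/kWh, attributes about $61/month to an app using a quarter of it. Comparing that figure against a cloud or colo quote is the kind of apples-to-apples exercise the question demands, and it also shows why forgotten, low-utilization servers are so expensive: their fixed costs get spread over almost no useful work.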
The National Guard Bureau will deploy 13 new cyber protection teams composed of about 500 soldiers across the nation to help protect the network infrastructure, the military arm announced Dec. 9. The Air Guard will also deploy four new "Cyber Operations Squadrons" in Idaho, Michigan, Texas and Virginia, along with a "cyber Information Surveillance Reconnaissance (ISR) squadron" in California and a "cyber ISR group" in Massachusetts. Collectively the deployments are geared toward a federal effort to protect against mounting cyberthreats. The teams will run simulations, and share contacts, information and resources with local organizations to help thwart and prevent attacks.
The cyber protection teams will be deployed across Alabama, Arkansas, Colorado, Illinois, Kentucky, Louisiana, Minnesota, Mississippi, Missouri, Nebraska, New Jersey, New York, North Dakota, South Dakota, Tennessee, Texas, Utah and Wisconsin, joining four teams already deployed across California, Georgia, Indiana, Maryland, Michigan and Ohio.
The teams are positioned around the nation's 10 Federal Emergency Management Agency response regions. This infrastructure is needed to support operations in the growing cyber world, said Air Force Col. Kelly Hughes, chief of the Space and Cyber Warfare Operations Division at the Air National Guard Readiness Center.
In case you haven’t seen our latest news, this morning we announced we have received a third-party certificate of HIPAA compliance across all of our facilities, including Mail-Gard, by independent assessor, Crimson Security Inc.
Data security and compliance is critical to all of our customers, but especially to those in the highly regulated healthcare industry. Compliance to HIPAA requirements has always been a focus of our healthcare clients. We are considered a Business Associate under the HITECH Act, which extended our clients’ compliance requirements to companies such as ours.
While the third-party review is a new undertaking, IWCO Direct has focused on HIPAA compliance for years. In fact, our first self-evaluation dates back to 2006. Since that time we have continued annual audits and regular enhancements. However, as a means to measure and assure that our own internal audits and self-certifications were valid, this year we engaged Crimson Security to assess our HIPAA/HITECH control environment. This independent assessment provided us a “second set of eyes” that reinforced our internal security and compliance team efforts, as well as reassured our healthcare client base of our strong corporate security posture.
A few months ago, I had the opportunity to sit in on a talk given by Christian Karam, a digital crime officer, cyber innovation and outreach, with Interpol, at G DATA’s 30th anniversary celebration. It was a fascinating discussion (and I got to continue it a bit on a shared cab ride with Karam the next day) about how cybercrime is universal yet regional, and how it is continuously evolving.
Karam’s talk focused on the difficulties facing law enforcement when it comes to stopping cybercrime internationally. Unlike security companies, law enforcement – Interpol specifically – isn’t just concerned with stopping cybercrime, but with putting the cybercriminals in prison. Why? Karam said:
If you just stop the criminals from their activities, they will come back with a smarter, faster, more elegant way to do damage.
The European Union’s three regulatory bodies have reached agreement on common rules for governing data privacy across all member states. Europe’s data privacy reform has been in the making for at least three years and now finally appears close to enactment.
While the package addresses what businesses can and cannot do with users’ personal data and outlines rules for law enforcement access to that data, it does not address cross-border data flows. Until recently those flows were governed by a set of rules called Safe Harbor, which the Court of Justice of the European Union struck down in October, causing a stir in the cloud services industry, where the biggest players by their nature operate globally distributed data center infrastructure.
“Our next step is now to remove unjustified barriers which limit cross-border data flow: local practice and sometimes national law, limiting storage and processing of certain data outside national territory,” Andrus Ansip, VP for the Digital Single Market, said in a statement on the recent agreement, reached earlier this week. Digital Single Market is an EC initiative to promote a unified single digital economy across the EU, governed by a common set of laws.
The number of Fortune 500 companies successfully using Big Data analytics to improve business intelligence and efficiency is not very high; according to a Forbes article, it may be as low as 15 percent. Among SMBs, that percentage is likely even lower.
One of the reasons so few are using Big Data is due to the lack of skilled professionals available to analyze the massive amounts of information being generated. Companies still don’t understand how to best leverage the collected data.
However, Big Data can be a real asset when properly utilized. It can customize customer offerings based on past purchases, it can anticipate supply and demand, and it can anticipate potential problem points and generate solutions. In short, Big Data can be a game changer for a business, as it was for these companies.
LOS ANGELES — The emails arrived overnight Monday into Tuesday. They threatened the safety of hundreds of thousands of students in the nation’s two largest school districts, promising that a violent plan already had been set in motion and raising the specter of guns and bombs inside numerous classrooms.
New York City officials opted to open their public schools on time Tuesday, calling the message an amateurish hoax imitating a popular television series. But across the country in Los Angeles, Superintendent Ramon Cortines took a different tack, closing every school in his sprawling district in a move that disrupted the daily lives of more than 640,000 students and their families.
Maybe the threat wasn’t real. But maybe it was. And at a time when the world is reeling from terrorist attacks — including two weeks ago in San Bernardino, just an hour’s drive from Los Angeles — Cortines said he had no choice but to be cautious.
At this point, almost every modern data center will have worked with some type of virtualization technology. A recent Cisco report noted that cloud workloads are expected to more than triple (grow 3.3-fold) from 2014 to 2019, whereas traditional data center workloads are expected to see a global decline, for the first time, at a negative 1 percent CAGR from 2014 to 2019.
Traditionally, one server carried one workload. However, with increasing server computing capacity and virtualization, multiple workloads per physical server are common in cloud architectures. Cloud economics, including server cost, resiliency, scalability, and product lifespan, along with enhancements in cloud security, are promoting migration of workloads across servers, both inside the data center and across data centers (even data centers in different geographic areas).
With this in mind, it’s important to note that the modern hypervisor and cloud ecosystem have come a long way. VMware, Microsoft, Citrix, and others are paving the way with enterprise-ready technologies capable of consolidating an infrastructure and helping it grow harmoniously with other tools. Today, many systems are designed for virtualization and cloud readiness. In fact, best practices have been written around virtualizing heavy workloads such as SQL, Oracle, Exchange, and so on. Taking advantage of these cloud-ready platforms will make your data center more agile and more capable of meeting market demands.
A new Webroot survey of 300 IT decision-makers indicated many small and medium-sized businesses (SMBs) intend to increase their security budgets next year.
The survey, titled "Are Organizations Completely Ready to Stop Cyberattacks?," revealed 81 percent of respondents said they plan to increase their annual IT security budget for 2016.
In addition, 81 percent noted they believe outsourcing IT solutions (including cybersecurity endeavors) would increase their bandwidth to address other areas of their business.
"SMBs play a pivotal role in helping drive the economies of all the countries polled, but past experiences have taught them they face an uphill battle when it comes to cybersecurity," said George Anderson, Webroot's director of product marketing, in a prepared statement. "This perception must change."
Technological advances and market forces are driving demand for data scientists, and universities are stepping up to fill the need by expanding their curriculums. Here's a closer look at some of the programs.
Data science isn't new, but as technologies and the job market have changed to create more demand for these skills, university offerings must change, too.
In some cases, existing courses and degree programs are simply being rebranded. In other cases, faculty members are purposely adding data science concepts to existing courses and creating new courses and degree programs.
"A case can be made that every student should develop data science skills. Computational thinking is another core part of the curriculum for a well-educated individual whether or not they become a programmer so they can understand the nature of what's involved and apply critical thinking to data analysis and data analytics," said Dan Lopresti, chair of Lehigh University's department of computer science and engineering, and also director of Lehigh's interdisciplinary Data X initiative.
In theory at least, a standard platform-as-a-service (PaaS) environment should greatly advance hybrid cloud computing by providing a common layer of software that abstracts away the underlying infrastructure complexity. To make sure that actually happens, the Cloud Foundry Foundation (CFF) announced today that it has created a certification through which IT organizations will be assured that multiple implementations of the open source Cloud Foundry PaaS are compatible with one another.
The first providers of Cloud Foundry PaaS software to attain a Cloud Foundry PaaS Certification include CenturyLink, Hewlett-Packard Enterprise, Huawei, IBM, Pivotal, SAP and Swisscom.
Cloud Foundry CEO Sam Ramji adds that this technology certification is the first step in a much broader certification effort that Cloud Foundry will embark on in 2016. As part of that effort, Cloud Foundry is working with some of the leading systems integrators in the industry to create a Cloud Foundry certification for technical professionals as well, says Ramji.
Yossi Ben Harosh is President & CEO of RiT Technologies.
All signs indicate that 2016 will be a year of many challenges. Disruptive technologies will be introduced and the exponential increase in computing power will continue, while businesses will demand prompt responses to quickly changing requirements. At the same time, the requirement to be highly resource-efficient will remain.
As a result of these challenges we predict these changes will emerge in 2016:
(TNS) - During its first meeting since the Dec. 2 terror attack in San Bernardino in which 14 people were killed and 22 wounded, the San Bernardino County Board of Supervisors on Tuesday unanimously approved several measures that will ramp up security at county facilities, seek state and federal funding assistance and extend paid leave for environmental health employees.
The meeting began with an emotional remembrance ceremony for the victims. Board Chairman James Ramos led a prayer.
“We pray for the families of those that are going through this tragic time. We ask now that the continuing of prayers continue to come in to San Bernardino County, and specifically to our (Environmental Health Services) department,” Ramos said.
Each year my team of futurists puts together a list of big trends for the coming year. We analyze how right we were with our “15 for 2015” and compile our “16 for 2016” (they must be dreading 2030). I’m relieved to see our methods are working; in 2015 we were right on the money – and money was one of the major things to change.
2015 saw Goldman Sachs become the first financial juggernaut to invest in Bitcoin, and I started to pay for my daily London commute with Apple Pay on my iWatch, along with the 40% of Londoners now using contactless payments for the tube; fintech has entered a revolution.
We also backed autonomous machines, and US airspace applications for drones have gone from one in 2014 to 50 per week today (source: FAA), leading to a rapid need for “drone-ports,” where I’m sure Amazon will be keen to set up a duty-free shop. Other trends we highlighted included B2B e-commerce now growing twice as fast as B2C commerce did; “women as a customer,” as all industries tackle diversity head-on; and one of my personal favorites, and a brave one: policymakers and diplomats globally coming together on trade and important policies like climate change. It was good to see that we are learning to compromise, as we saw with the climate change agreement.
Identifying and managing emerging risks is perennially a top concern for most organizations, as an unforeseen threat can quickly impact company operations in a significant way. CEB research shows that progressive companies regularly scan for new risks and embed systems and processes that enable them to detect risks early. They also work to uncover risks by encouraging contrarian thinking and questioning strategic assumptions.
With this in mind, every quarter, we survey senior executives in risk, audit, finance and compliance at leading companies on key emerging risks and the potential impact, probability and velocity for their organizations. The dashboard in Figure 1 captures the percentage of survey respondents that select a given emerging risk as one of their top five concerns, giving us insight into which emerging risk events are the most important to companies.
The traditional enterprise vendors’ hold on the data center market is said to be shaky and growing weaker by the day as new cloud and white box solutions come into vogue. But by the numbers at least, it seems like the old guard is holding its own for the moment.
According to Synergy Research Group, HPE, Cisco and Microsoft are tops in the $120 billion data center infrastructure market, which itself is growing at about 3 percent per year based largely on sales of virtualization software, blade servers and security solutions. HPE controls about 25 percent of the market, followed by Cisco at 13 percent, and then Microsoft, which has about 70 percent of the total software spend. Somewhat ironically, the cloud is driving many of these revenue gains by spurring demand for hyperscale and private cloud infrastructure.
And despite what you hear about converged, commodity infrastructure and tightly integrated computing solutions, it seems that the rack server still rules the roost in the data center, says the UK Register. About $10 billion of the $29 billion in sales for the third quarter went to the rack, with growth moving roughly in sync with the overall infrastructure market. And while HPE does rule in the established enterprise market, Cisco is tops in the fast-growing service provider segment, which is eager to match servers with advanced, high-speed networking.
Now that the state legislature has approved tax breaks for data center owners and users in Michigan, the project to convert the pyramid-shaped office building outside of Grand Rapids is a go. The future of the project, by Las Vegas-based data center provider Switch, hinged on the bill’s passage, and lawmakers rushed it through the legislative process to get it approved before the holidays.
The bill, passed by the state House Tuesday, now heads to Governor Rick Snyder’s desk for signing. In a phone interview, Switch CEO Rob Roy said the company has decided to go ahead with its Michigan data center construction plans “100 percent.”
Those plans call for 2 million square feet of building space, including the Steelcase Pyramid and several additional buildings Switch plans to erect around it. The full build-out could take up to 10 years and include six buildings, the company’s spokesman Adam Kramer told us earlier.
When you think of safety culture, what comes to mind? Perhaps it is visions of hallway walls plastered with safety advisories, or the common “Safety First” banner that is hung high over the manufacturing or production floor. While these visual aids might make an organization appear safety-oriented, they are often not enough to build a true culture of safety.
Safety culture is defined by the shared beliefs, attitudes and practices that determine the performance of an organization’s safety and health management. As it turns out, every organization has a safety culture—whether it is good or bad, healthy or weak. Even employers with the best intentions may say they value safety in the workplace, but are unable to provide the proper resources, training and communication needed to fully engage their employees to become involved. In turn, when employers do not engage workers in the process of building a safe culture, employees may not be able to recognize an unsafe work environment or feel comfortable speaking to their managers about existing safety risks.
It’s no question that workplace safety should be a top priority, but organizations need to keep in mind that they will see the greatest success when everyone in the workforce is driving the commitment. Here are four steps organizations can take to ensure a strong safety culture:
Akamai’s Third Quarter, 2015 State of the Internet Report had a bit of good news and a bit of bad news. As usual, the report offers a lot of numbers. Global connectivity speed increased a very small amount -- 0.2 percent -- to 5.1 Megabits per second (Mbps) from the second quarter. However, the gain represented a far more impressive 14 percent year-over-year increase.
Another bit of mixed news was found in average peak connection speed. It dipped a bit – 0.9 percent – to 32.2 Mbps from the second to the third quarters. That followed, however, an increase of 12 percent during the second quarter compared to the first. The year-over-year growth for the third quarter was 30 percent.
Highlights were noted for Singapore (a 25 percent speed increase to 135.4 Mbps) and Macao (an 18 percent increase to 73.7 Mbps). Singapore remained atop the international listings. The firm found that about 15 percent of the world has broadband connections of at least 15 Mbps, which the company rates as “4K ready”; this represents a 5.3 percent increase from the second quarter. In the U.S., 10 states had 10 percent or more of unique IP addresses operating at speeds of 25 Mbps or higher.
Embarrassing – or inevitable? How you view a failed security audit, whether in IT or at an overall organisational level, depends on whether you think security is a result or a process. There is a fundamental difference between the two points of view. In addition, current trends suggest that security is becoming less of an achievable state, and more of a continual improvement. Surveys confirm that many organisational executives consider that security breaches are no longer a question of “if”, but of “when”. In that case, a security audit should always “fail”. What counts is the reaction to such failure.
When an MSP asked a large agency within the State of Maryland if it could retrieve a file from six months ago with 100% confidence, the answer was no. What if the agency had to do a full system restore? What would that downtime look like? A week? A month? Even the organization's best-case estimate wasn’t sufficient by today's recovery point objective (RPO) and recovery time objective (RTO) standards.
At the end of the day, the IT department at the Maryland government agency was looking to upgrade its legacy backup system, but couldn’t afford to make any more expensive upfront investments. This is why the agency turned to an MSP (SANS Technology) to simplify its disaster recovery and backup needs.
The Business Challenge: Finding a DR Solution That Could Protect Every OS and Every Server
One of SANS Technology’s customers, a large agency within the State of Maryland, was looking to move from tape backup to disk-based backup and protect an environment that included:
GRCCS in collaboration with the Business Continuity Institute recently carried out the first Certificate of the BCI (CBCI) graduation ceremony at the Pullman Kuala Lumpur City Centre Hotel in Kuala Lumpur, Malaysia. The 30 graduates received their CBCI from David James-Brown FBCI, Chairman of the BCI, and witnessed by GRCCS Chairman YBhg. Tan Sri Dato’ Hj. Abd Karim B. Munisar.
The CBCI graduation ceremony was the first such ceremony for CBCI graduates to be carried out in Malaysia, and indeed the first in the world. It is an initiative carried out by GRCCS to honour the 30 CBCI graduates on their achievement of acquiring the CBCI credential this year. All 30 CBCI graduates had attended the Good Practice Guidelines Training and CBCI Exam classes carried out by GRCCS in 2015.
The CBCI graduation ceremony was also attended by Abdul Razak Yaacob, Chief Executive Officer of GRCCS and Chong Chen Voon, Chief Operating Officer of GRCCS, Nik Khairun Nisa Nik Mohd Khalid, Executive Director of GRCCS, other Executive Directors of GRCCS and distinguished guests from various organisations from the public sector, public listed companies, GLCs, universities and private companies, and was covered by many leading media networks including TV stations and newspapers in Malaysia. Some of the key government agencies that were present at the graduation ceremony are Malaysian Administrative Modernisation and Management Planning Unit (MAMPU), Prime Minister’s Department of Malaysia, Perbadanan Putrajaya (Putrajaya Corporation), and Kumpulan Semesta Sdn. Bhd. of Selangor State’s Menteri Besar Selangor Incorporated (MBI).
The 30 CBCI graduates come from various large organisations such as Bursa Malaysia (Malaysia Stock Exchange Authority), Maybank, Sime Darby, UMW Corporation, AEON, Measat, Berjaya Group, Boustead, Malaysian Technology Development Corporation (MTDC), Matrade, Malaysia Airlines, Takaful Ikhlas, Gas Malaysia, Okachi, Berjaya University College of Hospitality, MNRB, Pengurusan Asset Air Berhad (PAAB) and Universiti Sains Malaysia (USM), representing the public sector, public listed companies, GLCs, universities and private companies.
Tan Sri Dato’ Hj. Abd Karim, Chairman of GRCCS said, “This is certainly encouraging and shows that all sectors in Malaysia are indeed embracing BCM. It is also gratifying to see the representatives from the Human Resource, Heads of Divisions and CEOs also present today to witness the achievement of the CBCI graduates from their respective companies.”
“This is the first CBCI graduation ceremony for Malaysia. I applaud GRCCS for taking the initiative to carry out this CBCI graduation ceremony to honour and provide recognition to the business continuity professionals and trust this will encourage the future growth of the BCM ecosystem in Malaysia,” said David James-Brown FBCI, Chairman of the BCI.
Tan Sri Dato’ Hj. Abd Karim Munisar also stressed the need for business continuity to be a boardroom agenda for organizations, considering the potentially devastating financial and organizational impact of a disaster. He said "employers have the added benefit of having certified practitioners who can help towards achieving alignment or certification to ISO 22301, or to demonstrate enhanced levels of resilience which can give the organization the edge over their competition."
GRC Consulting Services (GRCCS) is an established professional consulting firm specialising in Governance, Risk and Compliance (GRC) Advisory Services. GRCCS is a Licensed Training Provider for the BCI, delivering BCI certification courses in Malaysia and China. The BCI training is based on the BCI’s Good Practice Guidelines, which are themselves aligned with ISO 22301. GRCCS also provides Human Capital Development Advisory services and is the leading provider in Malaysia of integrated GRC software, including CURA software and Governance Manager software, as well as the Everbridge Mass Notification System.
InformationWeek is spotlighting the companies whose innovative solutions to technology and business challenges earned them a place on our 2015 Elite 100. For more on the program, and to see profiles of the Top 10 Elite 100 finalists, click here. If you're interested in nominating your company for consideration in the 2016 Elite 100, click here.
It's not every day that an IT project has internal business units jostling to use it. But that's exactly the situation Intuit IT had on its hands after the launch of the Intuit Analytics Cloud (IAC).
Gathering and storing data wasn't a problem for Intuit, which offers financial software and tools such as TurboTax, QuickBooks, Quicken, and Mint.com. The challenge was deriving useful insight from all its data. That's why Intuit launched IAC: to turn lakes of data into pools of information.
As more companies and large corporations move their business operations to the cloud, increased awareness for tighter security is gaining traction as well. Organizations such as the Cloud Security Alliance (CSA) have been leading the path toward a more secure cloud computing environment for enterprises.
Large multinational tech companies have ramped up their security service offerings, as in the case of IBM, which in 2014 introduced the Dynamic Cloud Security portfolio, intended to address cloud security concerns around access control, data protection and increased visibility.
But unless users remain vigilant in taking the necessary steps to secure their networks, hacking and other cybersecurity threats are a very real concern. Here’s a list of the worst threats to cybersecurity and some of the countermeasures you can implement to avoid them.
A lot has changed in a few years.
When I talked about cloud three years back, I got frownie-faces from my peers. Skeptical looks that belied a deeper-seated fear or trepidation, probably having more to do with their internal image of what a CIO should be than the promise or peril in the new technology.
Now, enthusiasm runs ebulliently through the vendor community, animating the animal spirits and spurring on entrepreneurs in search of profits and glory. Cloud has been elevated to high strategy on the billionaire chess board. Mergers and acquisitions are abuzz. Amazon, armed with an overly energetic workforce, gets hypercompetitive in all ways good and ill, supplanting Oracle as one of our most vociferous vendors and perhaps the new alpha predator. Numerous smaller vendors — tiny even in the aggregate, compared with Amazon’s might — are quickly learning the new cloud lingo, differentiating themselves from Amazon and contemplating symmetrical and asymmetrical warfare. Today it’s Everyone vs. Amazon.
(TNS) - Officials closed all Los Angeles Unified School District campuses Tuesday morning after receiving a “credible threat” of violence involving backpacks and packages left at campuses.
Authorities said they planned a search operation of all of the LAUSD’s more than 900 schools. The nation’s second-largest school district has more than 700,000 students.
“I think it’s important to take this precaution based on what has happened recently and what has happened in the past,” LAUSD Superintendent Ramon Cortines said.
Lalit Dhingra is president of NIIT Technologies’ US operations.
Transforming any part of a business can put a strain on the entire organization regardless of how well it is planned. The recent push for digital transformation is no different.
As more businesses evolve operations to be digital-first, marketing and IT departments often approach the transition from different, yet equally relevant, points of view that can cause tension. This is due in part to the fact that each department does not understand the other’s intentions: chief marketing officers (CMOs) may perceive that IT teams don’t recognize the urgent need to integrate new data sources, while not appreciating how long such a project actually takes. Likewise, chief information officers (CIOs) are forced to work within shrinking budgets, which can make implementing new systems more difficult.
It seems that everyone is worrying about disruption these days, whether it’s market disruption, business model disruption, or technology disruption.
Of course, it’s always better to be the disruptor than the disruptee, so it seems that a primary strategic objective for the enterprise in the coming year is to improve the ability to disrupt others while minimizing the effects of disruption at home.
This is easier said than done, but with the advent of software-defined infrastructure and rapid application and service development, it’s no longer outside the bounds of possibility.
To run a successful data center you need a disaster plan. In other words: what do you do when something major occurs and you experience a site failure?
Everyone plans for high availability: handling the failure of a single component within the data center and making sure it doesn’t affect the user experience, so that everything continues to run. But what about the loss of an entire site?
Building a disaster recovery plan is like buying life insurance. With the purchase of life insurance you’re betting you will pass away before you will have paid for the policy, and the insurance company is betting you will not.
Guess what? They win more often than not, but that is a cost you have to bear to ensure your family’s security going forward. A disaster recovery plan is the same bet: you are spending money on the chance that something major will occur in your data center, so that you are able to deal with it when it does.
More and more, companies are selecting colocation providers to help them manage complex data center environments, lower capital and operating costs, and shore up physical security.
These are the key findings from a recent Ponemon Institute research initiative on how companies are better managing the complexity and costs of their IT infrastructure.
Top Reasons to Outsource
Managing data centers has become more complex. The Ponemon Institute’s research reveals IT leaders’ top three reasons to outsource to data center providers.
DENTON, Texas – The Federal Emergency Management Agency (FEMA) urges people to buy flood insurance now – before the next flood hits.
Flooding is the nation’s number one natural disaster, a fact people in this part of the United States know all too well. Yet statistics indicate most people ignore the risks associated with flooding and do not buy flood insurance.
However, with some forecasters calling for a wet winter in many parts of the country, local residents should buck that trend, said FEMA Region 6 officials in Denton, Texas. Those wet winter forecasts come on the heels of a spring and summer that saw Arkansas, Louisiana, Oklahoma and Texas receive major disaster declarations for flooding.
“Nobody here will forget the heartbreaking images from this spring’s devastating floods,” said Regional Administrator Tony Robinson. “Losing your family’s treasured possessions to floodwaters is hard enough; not having insurance to cover the replacement costs makes a bad situation worse.”
People who want to know whether they live in a flood-prone area and how to get flood insurance can learn more on www.floodsmart.gov. The site contains a wealth of information about the risks and costs of flooding, and the benefits of insurance.
“Once you buy an insurance policy, it takes 30 days to go into effect – so the time to act is now, before the next heavy rains,” Robinson said.
This article was republished with permission from Michael Volkov’s blog, Corruption, Crime & Compliance.
A Chief Compliance Officer can get so overwhelmed with risks that it is hard to keep focused on priorities. Risks are everywhere and no compliance program can address every one – the trick is keeping your eye on the ball and focusing on the significant risks.
There are lots of risks surrounding a company’s supply chain. Unfortunately, vendors, suppliers and their respective vendors and suppliers can drive you crazy when you start to calculate all the permutations. A supplier of a supplier of a supplier can create real risks for anyone in the chain.
In addressing this complex situation, a clear strategy has to be developed – predicated on defining the specific risks applicable to your supply chain.
The threat of terrorism looms over many societies and has been a considerable source of concern for professionals in the protective disciplines. The Paris attacks are still fresh in the collective memory and bring to the fore how profoundly terrorism can disrupt our way of life. The latest Horizon Scan Report by the Business Continuity Institute featured acts of terrorism as one of the top ten threats that business continuity professionals worry about for the fourth year running – a sign of such lingering concern.
Terrorist acts confront our fundamental sense of security and therefore involve our emotions. Our emotions, for better or for worse, influence our judgments on risk and how we carry on with our lives. As our societies respond to this continuing threat, that tension between intellect and emotion is also played out. Given our role in the protective disciplines, we need to be aware of our personal judgments on risk which influence our professional decisions.
In the latest edition of the BCI's Working Paper Series, Tim Jordan captures this tension quite well as he discusses the implications for practitioners’ understanding of risk, highlighting that terrorism is a persistent phenomenon. This is an important premise, as it influences business continuity, the way we analyse the business impact of certain risks and our responsibility for making our organizations resilient.
The Paper concludes that managing the risks associated with terrorism is complex and not easy to get to grips with. Nor is it a topic that lies solely within the bounds of business continuity, risk and resilience. Nevertheless, our profession faces terrorism risks, and the organizations we work for are affected directly or indirectly by terrorist acts. Therefore business continuity, risk and resilience practitioners should have a sound understanding of the issues and their accompanying effects.
In the end, terrorism and its effects influence the perception of risks and individual feelings. Business continuity, risk and resilience practitioners are not free from these effects. Given their important role, they are in a position where their tasks require them to critically examine their environment in a more considered way.
To download your free copy of ‘Terrorism as a lasting threat and its implications to practitioners’ view on risk', click here.
Replicating data across a wide area network (WAN) is generally considered too expensive and time consuming to be taken lightly by most IT organizations. The rise of Big Data naturally exacerbates that challenge.
For that reason, WANdisco created replication software for Hadoop environments that makes sure all the servers and clusters deployed across multiple data centers are fully readable and writeable, always in sync, and recover automatically from each other. Now WANdisco is extending the capabilities of the core WANdisco Fusion Platform via six plug-in modules that address everything from disaster recovery to replicating Hadoop data into the cloud.
Jim Campigli, chief product officer for WANdisco, says that as Hadoop deployments become more distributed, IT organizations are going to need to actively manage multiple deployments of Hadoop clusters. To address that issue, Campigli says many of them will need to find a way to cost-effectively keep Hadoop clusters synchronized with one another across a WAN.
(TNS) - The Newport News Community Emergency Response Team has figured out how to make a good deed even better. They turned an annual food drive into a training exercise for distributing disaster relief supplies.
Trained volunteers may be called to assist with various efforts in the aftermath of a disaster. One of those roles is to set up and manage points of distribution sites. These sites provide residents with items like water, tarps and food if damage from a disaster prevents stores from selling them.
But it's a challenge for the teams to practice the work because you have to have items to hand out, according to Dana Perry, emergency operations coordinator with the city's Division of Emergency Management.
Just a friendly reminder to all, be extra diligent when opening emails this time of year!
Hundreds of thousands of computers become infected by phishing emails that appear at first glance to be legitimate, and these emails appear more often than ever during the holiday season. Phishing is a form of online identity theft in which fraudsters trick users into submitting personal information to illegitimate web sites. Below is a list of items (from our friends at TechRepublic) that can help you identify phishing emails. If you receive one, simply delete it permanently by holding down the Shift key while pressing the Delete key. This will help protect your computer as well as the company’s assets.
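To illustrate the kind of checks such lists recommend, here is a minimal sketch of a rule-based phishing screen. The specific rules, phrases and scoring weights are illustrative assumptions, not the TechRepublic list itself, and real-world detection is far more involved:

```python
import re

# Illustrative trigger phrases -- real filters use much larger, tuned lists.
SUSPICIOUS_PHRASES = ("verify your account", "urgent action required",
                      "confirm your password")

def phishing_score(sender: str, subject: str, body: str) -> int:
    """Return a rough score; higher means more phishing-like."""
    score = 0
    # Assumption: this org never mails from these TLDs, so flag them.
    if re.search(r"@.*\.(ru|cn|tk)\b", sender):
        score += 1
    text = (subject + " " + body).lower()
    # One point per urgency/credential-harvesting phrase found.
    score += sum(phrase in text for phrase in SUSPICIOUS_PHRASES)
    # Links to a bare IP address are a classic phishing tell.
    if re.search(r"https?://\d{1,3}(\.\d{1,3}){3}", body):
        score += 2
    return score

print(phishing_score("support@198-secure.tk", "Urgent action required",
                     "Click http://192.168.10.5/login to verify your account"))  # 5
print(phishing_score("colleague@example.com", "Lunch", "See you at noon"))       # 0
```

A score threshold (say, 2 or more) could then route a message to quarantine; in practice this only complements, never replaces, user vigilance.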
We know that 911 call centers frequently receive imprecise locations of callers from wireless carriers -- and some don’t get any location information at all. Calls from landline phones are linked to addresses. But today more than 70 percent of all 911 calls originate from cellphones, a number only expected to increase.
More reliable location information could save lives, and earlier this year an order from the Federal Communications Commission (FCC) set targets for companies to improve both the availability and accuracy of location information. But those upgrades remain a long way off.
Under the new rules, carriers will have to provide caller location information accurate to within 50 meters for 80 percent of calls by 2021, along with vertical location information -- is the caller in the basement or on the 22nd floor? -- that would have to be in place in top markets by 2023.
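The 50-meter/80-percent horizontal-accuracy target is straightforward to express as a compliance check. A minimal sketch, where the sample error values are hypothetical:

```python
def meets_fcc_target(location_errors_m, max_error_m=50.0, required_fraction=0.80):
    """Check whether a batch of 911 location fixes meets the accuracy target:
    at least `required_fraction` of fixes within `max_error_m` meters."""
    if not location_errors_m:
        return False
    within = sum(1 for e in location_errors_m if e <= max_error_m)
    return within / len(location_errors_m) >= required_fraction

# Hypothetical sample: measured horizontal error, in meters, for ten 911 calls
errors = [12, 48, 30, 5, 61, 22, 49, 90, 15, 33]
print(meets_fcc_target(errors))  # 8 of 10 fixes within 50 m -> True
```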
Ah, nothing goes easy with that $67 billion Dell-EMC deal, does it? Today the plot thickened a bit more when VMware announced in a filing with the SEC that it was walking away from the agreement with EMC to form Virtustream as a jointly owned company. Re/code first reported this news.
This whole deal has from the start been a fairly complex tale, and like the movie Groundhog Day, one we seem to be telling over and over each time a new bit of news comes our way. It takes some background, so strap in while I explain the complications on top of the complications in this merger. If you haven’t been following along at home, you may want to take notes.
For starters, EMC owns 80 percent of VMware, but the company operates as a separate entity with its own board of directors and separately traded stock. A couple of weeks after the Dell-EMC merger announcement, EMC and VMware decided to throw a little wrinkle into the deal, announcing they were forming a separate jointly owned company called Virtustream from the company EMC had purchased in May for $1.2 billion. The companies further announced that, in spite of the 50/50 split, Virtustream’s financial results would be included on VMware’s books.
Ideally, neither one nor the other would happen. However, events over the past weeks have proved how uncertain times are. The attacks in Paris in November 2015 led a few days later to the “Brussels lockdown”, in which the entire capital city of Belgium and home of many European government institutions shut down overnight. News channels showed video footage of deserted streets, in which bars, cafes, restaurants and shops remained shut. That of course meant interruption of business for many enterprises. However, in business continuity terms, certain dramatic circumstances have ended up doing far greater damage in the past.
This Christmas, wearable tech is projected to be hackers’ next big target and there’s also more data at risk. Straight from the experts themselves, here are some ways to make sure you can have the best chance at keeping your connected device from turning on you.
1. You might not want to get the first generation. “With new platforms, we don’t know the vulnerabilities until some time has passed. Users can also make sure they have the most recent version for operating devices and when you get prompted for a software update, do it as soon as possible.” — John Herrema, Good Technologies.
2. Think hard about what data can be potentially taken. “Think about the worst and assume that the data could somehow get out and then ask yourself if you can truly tolerate that or not.”
Hiring new employees is always a positive sign for a growing business, but while expansion is exciting, it doesn’t come without its headaches.
While HR departments will know all too well how stressful it can be to make sure they’re hiring the right person, IT administrators often also find themselves facing difficulties with getting new starters on board – even if their role is sometimes overlooked.
After all, without the right level of access to key technologies, applications and services, they will be unable to perform their job effectively. And it’s not just for productivity reasons that IT has a role to play – they also need to make sure that the new employee is using their systems responsibly and not exposing the business to problems such as security breaches or data loss.
Therefore, it’s important that IT administrators understand what to do when new starters are getting set up. Read on to learn our top tips and best practices for technology leaders when it comes to this process, and find out what key questions you need to be asking in order to make things run smoothly.
Ninety percent of large businesses report experiencing major IT incidents throughout the year, and 60% report outages occurring on a monthly basis, yet only about half have a team dedicated to handling such occurrences. This is according to a study conducted by Dimensional Research on behalf of xMatters.
Major Incident Management Trends 2016 also revealed that nearly two-thirds of IT departments have target resolution times when an outage occurs, but three-quarters of them routinely exceed their target times.
Reliance on digital infrastructures has dramatically increased the impact and frequency of major incidents, according to the report. IT and business leaders within individual companies are mostly aligned on what constitutes major incidents and how to resolve them. However, standard definitions and processes are lacking between companies and across industries. Without these standards, IT departments lack benchmarks and best practices to help drive improvements.
According to the Business Continuity Institute's annual Horizon Scan Report, IT and telecoms outages have consistently been one of the top three threats to organizations, with the latest report highlighting that 81% of business continuity professionals expressed concern at the prospect of this kind of threat materialising.
“At long last, IT departments and business leaders are on the same page when it comes to recognizing the severity of business impact during a major incident and the importance of solving disruptions as quickly as possible,” said David Gehringer, principal at Dimensional Research and author of the study. “However, they’re unfortunately falling far short of their goals of solving problems on time and in an efficient manner, often due to poor alerting and communications management.”
“The survey findings show both enterprise IT teams and business leaders have come to grips with the occurrence of major incidents and IT outages, but insist on effective communications. In terms of business stakeholder frustration, we found that lack of effective communication trumps occurrence of incidents in the first place,” said Randi Barshack, CMO of xMatters.
By French Caldwell, Chief Evangelist at MetricStream
Major hacks have raised the profile of cybersecurity programs from the basement of IT operations to the board room – but cybersecurity alone is not enough to manage information risks. In the last few months we have seen a hack so mammoth that there is no way to do a full loss event analysis. The theft of security clearances and background information on 22.1 million current and former federal employees from the U.S. government’s Office of Personnel Management gives the hackers, alleged to be China’s military intelligence, access to the personal information and clearance data of people with access to the classified programs and information that are essential to national security. It gives the hackers the ability to run exploitation operations potentially for decades. This is so much worse than the Snowden leaks that it is difficult to understand why such data was even available through the unclassified network connected to the internet.
(TNS) - The answer to how Pierce County’s new 911 system is working out depends on who’s asked.
Some emergency dispatchers and law enforcement officers say the new computer system — which, according to South Sound 911, is supposed to make it easier for agencies to work together — is in some cases making their jobs harder.
Administrators of South Sound 911 say the more than $5 million upgrade has important new features and that users need to give it time.
“Part of what we need to do is be open-minded about new ways of doing things,” agency director Andrew Neiditz said. “What we’ve come from was pretty unacceptable.”
Do your customers understand the cloud? Recent data indicates confusion remains problematic for many cloud users.
In addition, the survey showed cost was one of the top concerns for companies considering a change to their IT infrastructure.
So how can managed service providers (MSPs) ensure their customers understand the cloud and its benefits?
The world of disaster recovery has made some quantum leaps over the last few years. Cloud computing, in particular, is helping companies of all sizes migrate applications from onsite systems to hosted environments accessible through the Internet. The cloud is now enabling organizations to safeguard critical resources from potential disruptions--whether they be micro (for example, human error, UPS battery failures or equipment failures) or macro (such as site-wide failures caused by natural disasters).
Historically, companies looking for on-demand (or hot site) failover were required to invest heavily upfront in capital equipment as well as ongoing operational expense (excluding the staff needed to run the operation following a disaster). Based on recent research from Windstream Hosted Solutions, you can expect to spend more than $300,000 over three years to protect a minimum 2 terabytes of data stored on five mission-critical servers.
Worse still, $85,000 of this total expense must be paid upfront for capital equipment and infrastructure before your DR environment can even go live. And these costs can be driven significantly higher if you don’t have an IT team versed in DR, don’t leverage VMware extensively, or don’t have an up-to-date and regularly tested recovery plan.
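Using the figures above, the non-capital portion of that spend can be broken out with simple arithmetic. The three-year total and upfront figure are the ones cited from the Windstream research; spreading the remainder evenly over 36 months is an assumption for illustration:

```python
# Figures cited in the text (Windstream Hosted Solutions research)
total_3yr = 300_000      # minimum three-year cost, USD
upfront_capex = 85_000   # capital equipment/infrastructure before go-live

# Assumption: the remaining spend is spread evenly over 36 months
ongoing = total_3yr - upfront_capex
monthly_opex = ongoing / 36

print(f"Ongoing spend over 3 years: ${ongoing:,}")              # $215,000
print(f"Implied monthly operating cost: ${monthly_opex:,.0f}")  # $5,972
```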
Collapsed buildings, damaged factories or destroyed shipping containers: Whenever natural catastrophes or man-made disasters strike, the physical damage is often devastating for companies. However, the less obvious economic impact from business interruption (BI) is often much higher than the cost of the actual physical damage and presents a growing risk to operating in an increasingly interconnected world.
The Global Claims Review 2015: Business Interruption In Focus report from Allianz Global Corporate & Specialty notes that BI now typically accounts for a much higher proportion of the overall loss than was the case 10 years ago. The average large BI property insurance claim is now in excess of €2 million, which is 36% higher than the corresponding average property damage claim of just over €1.6 million.
Both the severity and the frequency of BI claims are increasing, and most are caused by non-natural hazards such as human error or technical failure rather than by natural catastrophes. The top 10 causes of BI loss account for over 90% of such claims by value, with fire and explosion the top cause, accounting for 59% of all BI claims globally. The top causes of business interruption, ranked by value, were:
- Fire and explosion
- Machinery breakdown
- Faulty design/material/manufacturing
- Cast loss (entertainment)
- Human error/operating error
- Power interruption
“The growth in BI claims is fuelled by increasing interdependencies between companies, the global supply chain and lean production processes,” explains Chris Fischer Hirs, CEO of AGCS. “Whereas in the past a large fire or explosion may have only affected one or two companies, today, losses increasingly impact a number of companies and can even threaten whole sectors globally."
Interdependencies between suppliers can be a big unknown and many businesses are dependent on key suppliers. Business continuity planning should not only be part of a company’s own supply chain management programme, but should also be extended to all of its critical suppliers. It is important that supply chain management is treated as a cross-functional task involving at least functions such as procurement, logistics and finance.
The growing risk to supply chains is something that was also highlighted in the Supply Chain Resilience Report, published by the Business Continuity Institute, which revealed that a tenth of organizations are not aware of who their key suppliers are, a finding that is alarming given that 74% had suffered at least one supply chain disruption during the previous year, and half of these occurred below the tier one supplier.
While Microsoft is behind Amazon in public cloud, it has no need to play catch-up inside the enterprise data center. That combined with the second-largest public cloud business puts it in a good position to dominate in hybrid cloud, which is touted overwhelmingly as the cloud strategy of choice among enterprises.
The only other players with existing presence in enterprise data centers similar in scope are VMware and IBM. Of the two, IBM may be the harder one for Microsoft to compete with in hybrid cloud, since it also has made massive investments in public cloud, while the scale of VMware’s public cloud infrastructure is quite small in comparison to the other players.
The hybrid cloud opportunity is enormous. Cloud was the number-three 2016 investment priority for CIOs who participated in Gartner’s latest global survey, following business intelligence and analytics (their first priority) and infrastructure and data center (their second).
Several cybersecurity experts are predicting cybercriminals increasingly will target Apple (AAPL) devices in 2016.
What can managed service providers (MSPs) and their customers learn from these IT security newsmakers? Check out this week's edition of IT security stories to watch to find out:
According to a recent survey of IT decision makers at small and midsize businesses (SMBs), business continuity (including data protection and recovery) was identified as a top IT challenge. So, if your customers are starting to evaluate new disaster recovery solutions to address this challenge, here are some facts to help with their 2016 disaster recovery planning.
1. No. 1 Cause of SMB Downtime Isn't What You Think
While many still think of natural disasters as the top cause of downtime, industry data indicates that hardware failure and human error are far more common. And these micro disasters are a blind spot for SMBs.
In a new report from ActualTech Media, 79% of midsize companies (500 to 999 employees) couldn’t recover from a hardware failure in minutes, leaving companies in a “fix first, run later” mode. Read the full 2015 Disaster Recovery as a Service Attitudes & Adoption Report to learn more about your clients’ disaster recovery capabilities.
In the blink of an eye, the year is almost over. In looking back at what it meant for the cybersecurity industry, 2015 was predictably busy. We saw big acquisitions, including those of EMC by Dell and Websense by Raytheon. Rapid7 and Sophos both went public. Large funding rounds happened almost weekly, with the sector raising more than $2.3 billion in the first nine months.
Cybersecurity spending increased sharply and should cap out at about $75 billion by year’s end, according to leading analyst estimates. While the U.S. House and Senate continued to debate cybersecurity legislation, government agencies amassed a whopping security budget of $12.5 billion, collectively.
There were unforgettable breaches, like Anthem, BlueCross BlueShield and the U.S. Office of Personnel Management, although the biggest headlines went to the Ashley Madison breach. There also were countless daily reports of breaches due to “sophisticated attacks” and resulting losses from companies whose infrastructure — despite all the spending — remained woefully vulnerable. Even President Obama stepped into the fray, cementing an agreement with China in the hope of limiting the scope of nation-state hacking.
Internet-based technology and services are expanding with such speed that security has been left behind. As the Internet evolves at an untethered pace, hackers are iterating just as rapidly as the innovation. This has left us with a technological void that is being all too easily exploited, leading to a lack of clarity on how to effectively mitigate the risk from a corporate governance perspective.
On November 10th, federal prosecutors announced charges relating to last year’s JPMorgan Chase hack. In what they referred to as “the largest cyber hacking scheme ever uncovered,” prosecutors detailed how hackers stole information from more than 100 million individuals and broke into more than 12 different organizations, seven of which were financial institutions.
“The breaches of these firms were breathtaking in scope and in size,” said Preet Bharara, the U.S. attorney for the Southern District of New York. “The conduct alleged in this case showcases a brave new world of hacking for profit.”
Everyone knows the fable of the tortoise and the hare. But in business, forget the moral of the story. When it comes to file and data delivery methodologies, companies will always bet on the hare, never the tortoise. This only makes sense, especially when business relies on software solutions to improve the speed and accuracy of data-driven decisions and to execute confidently ahead of the competition. On this point, forward-thinking organizations constantly seek better approaches to move and manage information throughout integrated systems and to geographically dispersed endpoints across complex business networks in the fastest possible way.
Why Speed Matters
In theory, faster data movement means business moves faster. Increasing the efficacy of business processes and operations through accelerated transfer speed is an effective pathway to increasing turnaround. That, in theory, spells a quicker ROI coming from new software or technology designed to facilitate rapid data movement.
High-speed file transfer signifies an optimal capacity to rapidly send large files to customers and other trading partners under strict time mandates. After all, time is money. In today’s business, if the data wasn’t important enough to get wherever it’s going quickly, it probably didn’t need to be sent at all.
Are small to medium sized businesses (SMBs) prepared for a cyber attack? Not according to a new study by Webroot which indicated that just 37% of IT decision makers surveyed in the US, the UK and Australia believe their organizations to be completely ready to manage IT security and protect against threats. Furthermore, many believe they lack the resources needed to protect themselves against malware attacks.
The study, “Are organizations completely ready to stop cyber attacks?”, highlighted that within the majority of SMBs, IT teams are expected to handle all cyber security management and concerns, with IT employees at nearly a third of companies (32%) having to juggle security along with other IT responsibilities. This leaves employees stretched thin and unable to devote the necessary time to many critical cyber security tasks.
"SMBs play a pivotal role in helping drive the economies of all the countries polled, but past experiences have taught them they face an uphill battle when it comes to cyber security," said George Anderson, director of product marketing at Webroot.
Defending a company from cyber attacks is inherently challenging, and made even more so by budgetary constraints. The vast majority of SMBs do not have security budgets remotely comparable to those of large (and previously breached) organizations such as J.P. Morgan, Target and Anthem. In fact, according to the study, nearly 60% of respondents think their business is more prone to cyber attacks because they have too few resources for maintaining their defenses.
It is important that all organizations have plans in place to deal with a potential cyber attack as the latest Horizon Scan Report published by the Business Continuity Institute revealed that they are the number one threat according to business continuity professionals, with 82% of respondents to a survey expressing concern at the prospect of one occurring.
IT decision makers can point to specific areas in which they feel underprepared. According to the survey, almost half (48%) think their company is vulnerable to insider threats, such as employees. Following that, 45% believe they are unprepared for unsecured internal and external networks, such as public wifi, and 40% for unsecured endpoints, such as computers and mobile devices.
Mecklenburg County will soon offer active shooter training to its 5,300 employees -- a step County Manager Dena Diorio said Thursday is necessary to ensure workers are better prepared should gunfire erupt in a government building.
The measure comes a week after a radicalized husband-wife duo opened fire on a holiday party full of county employees in San Bernardino County, Calif., killing 14 people and wounding 21 others.
The slayings have struck a chord with local government employees, who occasionally face a disgruntled public and co-workers.
"It really brought to the forefront to me that we really need to make sure all the employees ... are prepared if the unthinkable happens here," Diorio said during a morning news conference. "The fact that this was a county building ... with county employees who were working inside really was an indication to me that while we think it could never happen here, you can never be too sure."
(TNS) - Response time from law enforcement in the recent shooting in San Bernardino was around four minutes, according to reports from law enforcement. Both San Bernardino County Sheriff’s Deputies and San Bernardino Police Officers swarmed the site of the shooting in a matter of minutes.
Shasta County law enforcement says the same could not be said if there were a shooting in the North State. Compared to Southern California, Shasta County has substantially fewer law enforcement officers, and depending on where those officers are within the county, response times could be stretched well past four minutes, local law enforcement officials said.
But Shasta County Sheriff Tom Bosenko says his department does not lack the resources to combat an active shooter.
For the past month, Chipotle Mexican Grill has been mired in a food safety crisis. An E. coli outbreak linked to Chipotle has sickened at least 52 people in nine states. In a seemingly unrelated outbreak, 120 people in Boston – most of them students at Boston College – also fell ill after contracting norovirus from eating at the quick-service chain.
While food safety and product recall concerns are always a major liability for industry players, the spate of infections poses even more of a threat to Chipotle as the company has built its reputation on the foundation of a healthy, responsible supply chain, boasting its use of fresh produce, meat raised without antibiotics, and a network of hundreds of small, independent farmers. As Bloomberg put it, the company’s biggest strength is suddenly its biggest weakness. Given the chain’s 1,900 locations and the rate at which it has expanded (about 200 new locations every year), its supply chain is already under significant pressure. When an audit found unacceptable practices earlier this year, the company suspended a primary pork supplier, pulling carnitas from the menu at about a third of its restaurants nationwide. The company pointed to its decisive action as proof of its commitment to sustainable agriculture, but many analysts said it highlighted the company’s inherent vulnerability to supply chain issues.
“You can never eliminate all risk, regardless of the size of suppliers, but the program we have put in place since the incident began is designed to eliminate or mitigate risk to a level near zero,” Chris Arnold, the company’s director of communications, told Bloomberg.
Figuring out exactly what any application needed in terms of storage capacity over the long haul was historically more art than science. The trouble was that given the high margin for error, a lot of organizations routinely overprovisioned the amount of storage they required. After all, it’s generally less of a sin to spend too much on storage than it is to see application performance suddenly drop for one unexplained reason or another.
As of this week, however, Tintri says it is looking to take the guesswork out of storage capacity management on its arrays via the preview of a predictive analytics application the company will make available next year.
The IT industry is in a significant period of transition, and the infrastructure landscape has changed a great deal. There are many options today, and the number of options will grow over the next two years. Having more options can lead to more complexity and potential limitations. As you assess your options you need more information and context, so you can make the right choices and avoid problems down the road.
Software defined infrastructure (SDI) has made it possible to create these new categories of products. In addition to traditional rack and blade servers and SAN storage, there is converged infrastructure, hyper-converged infrastructure and now composable infrastructure. As you evaluate these new infrastructure options, one of the most important considerations is choosing the right management software to support these products. You don’t want to add to complexity by creating islands of infrastructure that need to be managed separately.
CAMP MURRAY, Wash. – Washington’s devastating 2014 and 2015 wildfire seasons put vast areas of the state at risk of erosion and flooding, posing additional dangers to residents and communities. Today, a collaborative effort among all levels of government is finding ways to reduce that risk.
On Dec. 15-17, 2015, the Washington Military Department’s Emergency Management Division (EMD) and the Federal Emergency Management Agency (FEMA) will host a three-day workshop in Wenatchee to address topics such as assessments of burned areas in Eastern Washington, efforts already undertaken to reduce threats, analysis of unmet needs, and potential funding sources for new efforts to protect people and infrastructure.
The workshop will bring together partners on the Erosion Threat Assessment/Reduction Team, or ETART, a group first formed following Washington’s Carlton Complex Fire of 2014 and reactivated following the Oct. 20, 2015, federal disaster declaration for this summer’s historic wildfires.
Federal participants on the ETART include FEMA, the U.S. Army Corps of Engineers, the National Weather Service and the Natural Resources Conservation Service, among others.
The state and local partners include Washington EMD, the Washington State Conservation Commission, Department of Natural Resources, Department of Fish and Wildlife and the Okanogan and Whatcom conservation districts.
ETART relies on reports and assessments developed by various Burned Area Emergency Response (BAER) teams. BAER is a process created by the U.S. Forest Service and modified and used by several local teams to determine erosion risks and recommend appropriate treatments.
“When the land is stripped of trees and other vegetation by fire, healthy roots that soak up rainwater are lost,” said Anna Daggett, FEMA’s ETART coordinator. “Even moderate rain on burn scars can cause flash flooding or debris flows that can severely damage infrastructure, homes and businesses downstream.”
After the president issued a major disaster declaration for the 2014 Carlton Complex Fire, FEMA’s Public Assistance program provided about $2.4 million in grants targeted specifically for ETART-identified projects to reduce immediate threats of significant additional damage to improved public or private property. The federal share amounted to 75 percent of the total cost of $3.2 million for these projects. The state and local partners covered 25 percent, or $800,000.
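The grant math above follows the standard 75/25 Public Assistance split. As a quick illustrative sketch (the function below merely restates that arithmetic; it is not any FEMA tool):

```python
# Verify the cost share described above: 75 percent federal,
# 25 percent non-federal, applied to the $3.2 million project total.
def cost_share(total: float, federal_pct: float = 0.75) -> tuple:
    """Split a project total into (federal, non-federal) shares."""
    federal = total * federal_pct
    return federal, total - federal

fed, state = cost_share(3_200_000)
# fed is the $2.4 million federal share; state is the $800,000 non-federal share
```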
ETART assessments provided important information to EMD and FEMA to justify these grants.
The 2014 measures substantially reduced the effects of the wildfires by clearing culverts and ditches of debris, installing straw wattles to counter erosion, shoring up breached pond dams, and spreading grass seed over vast areas that had burned.
“ETART has shown to be an effective way to address post-fire dangers,” said Gary Urbas, EMD’s ETART coordinator. “Merging the work of so many experienced partners allows the team to tackle tough problems in our state.”
ETART now will be looking for additional financing streams, including other FEMA and federal programs as well as local and state sources, with the goal of significantly reducing damages resulting from post-fire flooding and erosion in Eastern Washington.
More information about the PA program is available at www.fema.gov/public-assistance-local-state-tribal-and-non-profit and on the Washington EMD website at http://mil.wa.gov/emergency-management-division/disaster-assistance/public-assistance.
Additional information regarding the federal response to the 2015 wildfire disaster, including funds obligated, is available at www.fema.gov/disaster/4243.
AUSTIN, Texas – Most Texans who have registered for disaster assistance from the Federal Emergency Management Agency (FEMA), following the October severe storms, tornadoes, straight-line winds and flooding, will receive an automated phone call from the U.S. Small Business Administration (SBA).
FEMA grants may not cover all damage or property loss. Private insurance and low-interest loans from the SBA are major sources of additional funding for disaster recovery.
The recorded message gives instructions on how to request an application for a low-interest disaster loan. Loans are available to help disaster survivors – including businesses, private non-profits, homeowners and renters – with their recovery efforts.
Businesses of all sizes and nonprofit organizations may borrow up to $2 million to repair or replace damaged or destroyed real estate, machinery and equipment, inventory, and other business assets.
SBA also offers Economic Injury Disaster Loans (EIDLs) to help meet working capital needs caused by the disaster. EIDL assistance is available to businesses regardless of any property damage.
Disaster loans up to $200,000 are available to homeowners to repair or replace damaged or destroyed real estate. Homeowners and renters are eligible for up to $40,000 to repair or replace damaged or destroyed personal property.
SBA provides one-on-one assistance to disaster loan applicants at any of the Disaster Recovery Centers in the affected area. Additional information is available online at sba.gov/disaster or by calling SBA Customer Service Center at 800-659-2955. Deaf and hard-of-hearing persons may call 800-877-8339.
To be considered for all forms of disaster assistance, SBA encourages survivors to first register with FEMA online at DisasterAssistance.gov or by phone (voice, 711 or relay service) at 800-621-3362. TTY users should call 800-462-7585. The toll-free lines are open 7 a.m. to 10 p.m. seven days a week. Multilingual operators are available.
The presidential disaster declaration of Nov. 25 makes federal assistance available to eligible individuals and business owners in 16 counties: Bastrop, Brazoria, Caldwell, Cameron, Comal, Galveston, Guadalupe, Hardin, Harris, Hays, Hidalgo, Liberty, Navarro, Travis, Willacy and Wilson.
Since VMware purchased Desktone and Amazon announced its cloud workspace offering in 2013, industry analysts have been looking forward to the year of “DaaS” (Desktop as a Service). But the tsunami of sales has yet to hit our shores. Now there’s strong speculation in the market that Microsoft will release its own DaaS product next year. Could Microsoft’s entrance into the market make 2016 the year of DaaS?
If independent numbers aggregated by Clarity Channel Advisors are any indication—and I believe they are—then the answer is absolutely “yes.” What's more, the numbers also give us insight into why Microsoft would push their own DaaS platform.
Before getting to what the numbers reveal, here’s some background on where they come from. Hint: it’s mostly from companies like yours.
Microsoft has bought property in Texas where it plans to build a massive data center campus over the course of five years.
As it continues to grow its cloud services business, Microsoft has been expanding the data center capacity to support those services around the world at a rapid pace. Global data center construction has been viewed as an expensive arms race with its chief competitor in cloud, Amazon Web Services, as companies spend billions of dollars to improve the quality of their services to users and increase the amount of locations where they can store their data and virtual infrastructure.
Microsoft announced a multi-site expansion initiative in Europe last month, and in September said it had launched three cloud data centers in India. Amazon in November announced it was preparing to bring online cloud data centers in the UK and South Korea.
It has been a year since the 2014 “Snovember” blizzard that buried parts of Buffalo, N.Y., in up to seven feet of snow and resulted in 14 deaths. In response to that disaster, Buffalo’s Office of Homeland Security/Emergency Management (OHS/EM) has taken substantial steps to be much better prepared.
Miscommunication or resistance interfered with efforts to clear streets and town officials complained that the county failed to send plows where they were most urgently needed.
Erie County Executive Mark Poloncarz countered that town officials refused to make use of a computer-based system to coordinate the response of crews and also didn’t take part in daily conference calls.
So what went wrong? According to OHS/EM Commissioner Garnell Whitfield, Buffalo and the surrounding county were “overwhelmed” by the sheer scale of this “once in a century” snow emergency. After all, it did dump up to seven feet of snow in parts of the city over seven days. “We have a lot of new equipment and new strategies going forward,” he said.
Disasters can take many forms, from weather events to database corruption. CloudEndure, a cloud-based disaster recovery service, announced a $7 million investment today led by Indian consulting firm Infosys and previous investor Magma Venture Partners.
Today’s investment brings the total to just over $12 million.
At first blush, Infosys may seem like an odd partner, a traditional consulting firm investing in a cloud service provider, but the company was looking for a couple of different investors for this round, CloudEndure’s VP of business development, Gonen Stein told TechCrunch.
The new year will mark the dawn of the third decade of e-government. It comes as citizen and business expectations of government are being shaped by their digital lives — that is, the way they find information, buy things and request services in the wider economy.
Think of the companies that connect with you regularly in a rich and contextual way and there are probably application program interfaces (APIs) working below the presentation layer, connecting two or more discrete apps to create a better, fuller, more rewarding experience for the person looking to get stuff done.
Moving money and permissions reflects much of what government does, including core functions such as providing public assistance, licensing and the full spectrum of regulation. These are more effectively done in a bit-based world than an atom-based one. Open data makes these government actions more transparent, while throwing off data exhaust that is fueling the creation of useful things through the fledgling civic startup space and its nonprofit counterparts in the civic hacking space.
(TNS) - After months of training and set up, Baldwin County, Ga., emergency officials are ready to launch a new mass alert system, created to notify residents of emergency situations via phone and email.
In January, local EMA Director and Baldwin Fire Chief Troy Reynolds addressed the Baldwin County Commissioners about an Emergency System Grant application that would provide an emergency notification system for residents on a wide range of emergency situations that arise in Milledgeville and Baldwin County.
In August, the Georgia Emergency Management Agency (GEMA) announced that Baldwin County was awarded a $17,012 Hazard Mitigation Grant for its Mass Alert Enhancement project.
(TNS) - Federal disaster officials warned Tuesday that El Niño-fueled storms in California could inflict millions of dollars in damage this winter — from mud-soaked homes to broken levees to downed electrical lines — and said they’re taking steps to minimize the toll.
A new report by FEMA details the havoc that ensued during the strongest El Niños of the past, including the 1982-83 event that caused 36 deaths, with the aim of honing current efforts to brace for landslides, flooding and outages.
This year’s El Niño ranks among the three biggest in half a century. The weather pattern is marked by above-average ocean temperatures in the equatorial Pacific that, when really warm, tend to drive moisture toward California.
John M. Hawkins is VP of Corporate Marketing and Communications for vXchnge.
Twenty years in technology may seem more like 100 years when compared to other industries. In just one year a company’s landscape can change significantly. Think about how businesses scale and operate on a functional level, then add in changing technologies along with the exponential increase of data and dynamic content needed to drive business.
Will your data center strategy survive 10, or even 20 years? Will the company grow like your key stakeholders expect? If so, you may need multiple data centers, strategically located, just to handle your requirements. On the other hand, your CFO might have a more conservative estimate and is responsible for how much is actually spent on data centers.
In addition to size, you have to consider whether your data center(s) might become obsolete in 5, 10, 15, or 20 years.
The need for greater energy efficiency and more capacity has put cooling systems high on the list of priorities in 2016 for IT, facilities, and data center managers in North America, according to new research from Emerson Network Power. Results show that before the end of next year, more than half of all data center cooling systems will be upgraded, according to Emerson.
That’s on top of the 40 percent of respondents that already did so in the past five years and another 20 percent in the process of doing so. While many are upgrading voluntarily, a combined 39 percent said the need to meet state energy codes or Energy Star/LEED requirements was the catalyst.
The size of the data center seems to matter as 62 percent of the upgrades will occur in data centers under 10,000 square feet and 18 percent in those larger than 50,000 square feet. Inefficient cooling systems are an especially widespread problem in smaller data centers.
Storage is available at a fraction of the cost of just a few years ago, but the enterprise needs so much of it these days that the overall impact on operating budgets is largely a wash.
In fact, it seems that the adage “Build it and they will come” is no longer appropriate for the storage farm. In today’s world, “Build it and they’ll want more” is more accurate.
The proof is in the numbers. Despite falling prices, worldwide factory revenues for storage systems grew 2.8 percent in the third quarter to top $9.1 billion, according to IDC. Total capacity was up a stunning 31.5 percent to 33.1 exabytes, again in the third quarter alone. A key driver is the rise of hyperscale infrastructure, which accounted for 23.4 percent of server revenues, abetted by on-server solutions that gained nearly 10 percent. The largest share of the market (more than half) still went to traditional external storage arrays, but it is telling that this segment’s sales dropped by more than 3 percent compared to 3Q 2014.
Google is launching a new privacy tool for Google Apps Unlimited users today. The new Data Loss Prevention feature will make it easier for businesses to make sure that their employees don’t mistakenly (or not so mistakenly) email certain types of sensitive information to people outside of the company.
Businesses that subscribe to this plan for their employees now have the option to turn on this tool and select one of the new predefined rules that, for example, automatically reject or quarantine any email that contains a social security or credit card number. Businesses can choose from these predefined rules and also set up custom detectors (a confidential project keyword, for example). Google says it’s working on adding more predefined rules, too.
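To illustrate how a predefined rule of this kind typically works under the hood, here is a generic sketch; it is not Google's implementation, and the regexes, the Luhn check, and the `classify` function are all hypothetical:

```python
import re

# Hypothetical DLP-style content rule: flag outbound messages that appear
# to contain a U.S. Social Security number or a credit card number.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    """Luhn checksum, used to cut false positives on card-like digit runs."""
    digits = [int(d) for d in re.sub(r"\D", "", number)][::-1]
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def classify(message: str) -> str:
    """Return an action for an outbound message: 'reject', 'quarantine', or 'allow'."""
    if SSN_RE.search(message):
        return "reject"
    for match in CARD_RE.finditer(message):
        if luhn_valid(match.group()):
            return "quarantine"
    return "allow"
```

A message such as "Card: 4111 1111 1111 1111" would be quarantined here because the digit run passes the Luhn check, while ordinary numbers that fail it are left alone.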
Target tops this week's list of IT security newsmakers to watch, followed by VTech, JD Wetherspoon and the Canadian Radio-television and Telecommunications Commission (CRTC).
What can managed service providers (MSPs) and their customers learn from these IT security newsmakers? Check out this week's edition of IT security stories to watch to find out:
LAS VEGAS – Hyperconverged infrastructure did not exist as a concept two or three years ago. Today, it is one of the fastest-growing methods for deploying IT in the data center, as IT departments look for ways to adjust to their new role in business and new demands that are placed on them.
Gartner expects it to go from zero in 2012 to a $5 billion market by 2019, becoming the category leader by revenue in pre-integrated full-stack infrastructure products. The category also includes reference architectures, integrated infrastructure, and integrated stacks.
“Hyperconvergence simply didn’t exist two years ago,” Gartner analyst Andrew Butler said. “Near the end of this year, it’s an industry in its own right.” But, he added, the industry has a lot of maturation ahead of it, which means not all of the vendors in the space today will still be in it a few years from now.
As hyperconverged infrastructure emerges as one of the favorite new platforms underneath applications running in enterprise data centers, a number of myths have emerged about it. Because it is new – hyperconverged infrastructure didn’t exist two years ago – it’s natural that many people don’t quite understand it and that myths proliferate.
Gartner analysts Andrew Butler and George Weiss outlined the most widespread myths about these systems in a presentation at the market research firm’s data center management summit this week in Las Vegas. Here are some of the highlights:
SnapLogic, a company that helps connect data from legacy applications to the cloud or to a centralized internal data lake, announced a $37.5 million round today.
Investors include Microsoft and Silver Lake Waterman, the growth capital arm of Silver Lake along with existing investors Andreessen Horowitz, Ignition Partners and Triangle Peak Partners. Today’s investment brings the total to $96.3 million.
SnapLogic essentially acts as a translator for data (or streams of data) moving to the cloud or into a data lake inside an enterprise, SnapLogic CEO Gaurav Dhillon explained. “We have over 400 snaps — connectors or adaptors to various systems like Workday, Concur, SAP, Twitter, Tableau and machine protocols,” he explained.
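The "translator" idea can be sketched generically: each connector maps a source system's native records into one canonical shape, so downstream pipeline steps never see the native formats. The field names and functions below are hypothetical illustrations, not SnapLogic's actual snaps:

```python
from typing import Callable, Dict, Iterable, List

# A canonical record shape shared by every pipeline step.
Record = Dict[str, str]

def from_crm(row: Record) -> Record:
    """Hypothetical adapter: translate a CRM row into the canonical shape."""
    return {"id": row["customer_id"], "name": row["full_name"]}

def from_hr(row: Record) -> Record:
    """Hypothetical adapter: translate an HR row into the canonical shape."""
    return {"id": row["emp_no"], "name": row["display_name"]}

def pipeline(adapter: Callable[[Record], Record], rows: Iterable[Record]) -> List[Record]:
    """Run every source row through its adapter into the canonical shape."""
    return [adapter(r) for r in rows]
```

With this pattern, adding a new source system means writing one small adapter rather than touching every consumer of the data.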
Economic impact from business interruption (BI) is often much higher than the cost of physical damage in a disaster and is a growing risk to companies worldwide, according to a new report from Allianz Global Corporate & Specialty (AGCS).
Its analysis of more than 1,800 large BI claims from 68 countries between 2010 and 2014 found that business interruption now typically accounts for a much higher proportion of the overall loss than was the case 10 years ago.
Both the severity and the frequency of BI claims are increasing, AGCS warns.
The average large BI property insurance claim now exceeds €2 million (€2.2 million, or $2.4 million), some 36 percent higher than the corresponding average property damage claim of just over €1.6 million ($1.8 million), the global claims review found.
More organizations across a number of industries are looking at different ways to control storage and their data. Traditional storage solutions still have their place, but new methods are allowing IT shops a lot more flexibility in how they design their storage solutions, and flash is one of the most popular options. So is it really catching on? Is the world really going to solid-state?
Let’s examine one use case that’s been seeing a resurgence in the modern enterprise: VDI.
In the past, technologies like VDI were seen as heavy fork-lift projects which required time, resources, dedicated infrastructure, and big budgets. That has all changed with advancements within network, compute, and storage. Today, strong VDI offerings provide five-nines availability and greater scalability, as well as non-disruptive operations. With this in mind, it’s important to note that for a truly successful VDI deployment, all-flash storage should be part of the change in the VDI ecosystem. Ultimately, this will enable much higher performance for end users.
Often, with sub-millisecond latency, the user experience with all-flash storage behind the scenes is even better than the performance users had with physical devices, and definitely better than VDI on spinning disks or even hybrid storage solutions. This type of technology has become one of the big change factors that now enable successful VDI deployments.
Nir Polak is CEO and Co-founder of Exabeam.
There’s one thing every heavily publicized data breach has in common: It wasn’t uncovered until it was too late. The breach at the U.S. Office of Personnel Management (OPM) in February was still active more than three months after security workers learned of it. In fact, many of these breaches have another thing in common: preventative security measures weren’t enough to stop them.
Prevention has always been a major component of security. Firewalls stand at the perimeter of sensitive, private networks and attempt to keep every malicious file out. As the OPM breach and countless other disasters prove, though, it’s just not enough. More than 21 million records were compromised before the breach was detected in the first place. Prevention-focused initiatives have a place in cybersecurity, but there needs to be more. As we move into 2016 and confront new threats, detection needs to become an equally significant component of enterprise IT security standards. Like so many other parts of the enterprise, the answer to improving the approach to network security and eliminating disasters comes in the form of analytics derived from big data.
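Detection of this kind usually starts from behavioral baselines built out of event data. As a deliberately minimal sketch (this is not Exabeam's method, and real systems model far richer features than a daily count), flagging a user whose activity strays far from their own history might look like:

```python
import statistics

# Illustrative anomaly check: is today's count of, say, file-access events
# more than `threshold` standard deviations above this user's own baseline?
def is_anomalous(history: list, today: int, threshold: float = 3.0) -> bool:
    """Flag today's event count against the user's historical mean/stdev."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        # A perfectly flat baseline: any change at all is unusual.
        return today != mean
    return (today - mean) / stdev > threshold
```

A user who normally touches around ten files a day and suddenly touches fifty would trip this check, while ordinary day-to-day variation would not.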
Pharmaceutical companies operate with a singular objective: bring drugs to market. This is how they profit, how they ensure that their products help the most people, and how they maintain the resources to continue innovating.
The lifecycle of drug development can be complex and onerous, despite improvements to the regulatory approval process over the past several years. Now, a trend sweeping the industry is forcing many pharmaceutical companies to decide under which circumstances they’re willing to divert resources from their mission of helping the masses.
Expanded Access, or “Compassionate Use,” refers to the use of an experimental drug not yet approved by the FDA to treat a critically ill patient outside of a clinical trial. The FDA received more than 1,800 requests for access to experimental drugs last year and, over the last five years, it has approved 99% of these requests.
It’s been said that in the near future the enterprise won’t need to worry about hardware – data productivity will be driven by software-defined architectures sitting atop dumb, commodity boxes.
It’s also been said that before too long the enterprise won’t have to worry about architectures or middleware either – just push everything into the cloud and let someone else deal with service provisioning.
And now we have knowledge workers accessing enterprise resources through their own preferred client devices, easing up on the requirement to supply everyone with a PC.
On 7th December 2015, the Luxembourg presidency of the Council reached an informal agreement with the European Parliament on common rules to strengthen network and information security across the EU.
The new directive will set out cybersecurity obligations for operators of essential services and digital service providers. These operators will be required to take measures to manage cyber risks and report major security incidents, but the two categories will be subject to different regimes.
Xavier Bettel, Luxembourg's Prime Minister and Minister for Communications and the Media, and President of the Council, said: "This is an important step towards a more coordinated approach in cybersecurity across Europe. All actors, public and private, will have to step up their efforts, in particular by increased cooperation between member states and enhanced security requirements for infrastructure operators and digital services".
The directive lists a number of critical sectors in which operators of essential services are active, such as energy, transport, finance and health. Within these sectors, member states will identify the operators providing essential services, based on clear criteria laid down in the directive. The requirements and supervision will be stronger for these operators than for providers of digital services. This reflects the degree of risk that any disruption to their services may pose to society and the economy.
If you don't want to send the wrong message, watch how you punctuate your texts. Text messages that end with a period are perceived to be less sincere than messages that do not, according to newly published research from Binghamton University. This finding has interesting implications for crisis communications messages.
A team of researchers led by Celia Klin, associate professor of psychology and associate dean at Binghamton University's Harpur College, recruited 126 Binghamton undergraduates, who read a series of exchanges that appeared either as text messages or as handwritten notes. In the 16 experimental exchanges, the sender's message contained a statement followed by an invitation phrased as a question. The receiver's response was an affirmative one-word response (Okay, Sure, Yeah, Yup). There were two versions of each experimental exchange: one in which the receiver's response ended with a period and one in which it did not end with any punctuation. Based on the participants' responses, text messages that ended with a period were rated as less sincere than text messages that did not end with a period.
LAS VEGAS – The business of providing colocation data center services is changing in numerous ways and for different reasons. Customers are getting smarter about what they want from their data center providers; enterprises use more and more cloud services, and the role of colocation data centers as hubs for cloud access is growing quickly as a result; technology trends like the Internet of Things and DCIM are impacting the industry, each in its own way.
Some of the trends are having a profound effect on the competitive makeup of the market, where even some of the largest players are making big strategic changes and spending lots of money on acquisitions to adjust to the new world they are doing business in.
Bob Gill, a research director at Gartner, outlined eight of the most consequential current trends in the colocation industry at the research and consulting giant’s annual data center operations summit here this week:
As cold weather sets in, clothing layers increase, scarves are pulled tighter, and noses become redder. This time of year can also bring the dreaded runny nose, scratchy throat, cough, body aches, and headache of the seasonal flu. As you fretfully try to protect yourself from the winter season with warmer clothes and hot drinks, are you also taking steps to protect yourself from the bigger threat of the flu?
Flu season is coming. Are you ready to fight the flu?
An annual flu vaccine is the first and most important step to preventing the flu. Everyone 6 months and older should get an annual flu vaccine. It takes 2 weeks for protection from a flu vaccine to develop in the body, so you should get vaccinated soon after the flu vaccine becomes available.
While you may be stocking up on hand sanitizer, avoiding crowded events, and distancing yourself from friends or acquaintances who let out a sniffle or two, if you haven’t gotten your seasonal flu vaccine, you haven’t taken the most important step to protect yourself from the flu.
Getting your flu vaccine is easy; having the flu is not.
Reminders about getting the flu vaccine are everywhere, from your doctor’s office to your local pharmacy, and even on the news and social media networks. Getting a flu vaccine can take just a few minutes of your day. Getting the flu, however, can put you out of work or school for days, sometimes weeks. Taking a little time for your health now could save you from missing important events, work deadlines, or opportunities in the future.
Do your part for those you love.
When you get a flu vaccine, you are not only protecting yourself from the flu, but you are also protecting the people around you who are more vulnerable to serious flu illness. As the holiday season approaches, you may be around young children, older family members, or others who have a high risk of contracting the flu or developing complications from the flu.
The flu is a serious illness that can have life-threatening complications for some people. The flu causes millions of illnesses, hundreds of thousands of hospitalizations, and thousands of deaths each year. Some people, such as older people, young children, pregnant women, and people with certain health conditions, are at high risk for serious flu complications.
Get your flu shot to protect yourself and those around you. Do your part to protect the important people in your life.
Avoid germs during flu season.
While getting a yearly vaccination is the first and most important step in protecting against flu, there are additional steps you can take to avoid germs and the flu. Here are a few tips:
- Try to avoid close contact with sick people.
- If you are sick, limit contact with others as much as possible to keep from infecting them. Keep your germs to yourself.
- If you are sick with flu-like illness, CDC recommends that you stay home for at least 24 hours after your fever is gone, except to get medical care. (Your fever should be gone for 24 hours without the use of a fever-reducing medicine.)
- Cover your nose and mouth with a tissue when you cough or sneeze. Throw the tissue in the trash after you use it.
- Wash your hands often with soap and water. If soap and water are not available, use an alcohol-based hand rub.
- Avoid touching your eyes, nose, and mouth. Germs spread this way.
- Clean and disinfect surfaces and objects that may be contaminated with germs like the flu.
Don’t know where to get your flu shot?
Flu vaccines are offered in many locations, including doctor’s offices, clinics, health departments, pharmacies, and college health centers, by many employers, and even some schools. You don’t have to see your doctor to get a flu shot! There are plenty of locations available that provide vaccinations.
This Vaccine Locator is a useful tool for finding a flu vaccine in your area.
Don’t wait until you are lying sick in bed to wish you had gotten a flu shot. There are steps you can take to prevent the flu and protect those around you. Get your flu vaccine today, and remind someone you care about to do the same. As long as flu viruses are circulating, it is not too late to get a flu vaccine!
AUSTIN, Texas – Renters displaced from their homes or apartments by the October storms may be eligible for federal disaster assistance, which may include grants from the Federal Emergency Management Agency (FEMA) and low-interest disaster loans from the U.S. Small Business Administration (SBA).
FEMA grants for eligible renters may include funds to cover the cost of renting another place to live.
Renters may also be eligible for Other Needs Assistance (ONA). ONA grants help survivors with uninsured or underinsured expenses and serious needs caused by the disaster, including:
- Child care
- Heating fuels
- Moving and storage expenses
- Disaster-related funeral and burial expenses
- Disaster-related dental and medical expenses, such as wheelchairs, canes and prescriptions
- Repair or replacement of personal property lost or damaged in the storm, including furniture and appliances, as well as job-related tools and equipment required by the self-employed
- Primary vehicles, approved second vehicles and modified vehicles damaged by the disaster
SBA offers low-interest disaster loans to help renters repair or replace disaster-damaged personal property, including automobiles. Survivors may be eligible to borrow up to $40,000, depending on their losses.
Texans in the following counties may register for disaster assistance for damage or losses sustained during the period Oct. 22 to Oct. 31: Bastrop, Brazoria, Caldwell, Comal, Galveston, Guadalupe, Hardin, Harris, Hays, Hidalgo, Liberty, Navarro, Travis, Willacy and Wilson.
Survivors can apply online at DisasterAssistance.gov or by phone (voice, 711 or relay service) at 800-621-3362. TTY users should call 800-462-7585. The toll-free lines are open 7 a.m. to 10 p.m. seven days a week.
# # #
All FEMA disaster assistance will be provided without discrimination on the grounds of race, color, sex (including sexual harassment), religion, national origin, age, disability, limited English proficiency, economic status, or retaliation. If you believe your civil rights are being violated, call 800-621-3362 or 800-462-7585 (TTY/TDD).
FEMA’s mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards.
The SBA is the federal government’s primary source of money for the long-term rebuilding of disaster-damaged private property. SBA helps businesses of all sizes, private non-profit organizations, homeowners and renters fund repairs or rebuilding efforts and cover the cost of replacing lost or disaster-damaged personal property. These disaster loans cover losses not fully compensated by insurance or other recoveries and do not duplicate benefits of other agencies or organizations. For more information, applicants may contact SBA’s Disaster Assistance Customer Service Center by calling 800-659-2955, emailing firstname.lastname@example.org, or visiting SBA’s website at www.sba.gov/disaster. Deaf and hard-of-hearing individuals may call 800-877-8339.
Visit www.fema.gov/texas-disaster-mitigation for publications and reference material on rebuilding and repairing safer and stronger.
U.S. property-casualty insurers face another year of disruptive change in 2016, according to a new report by Ernst & Young.
In its 2016 U.S. Property-Casualty Insurance Outlook, EY says that digital technologies such as social media, analytics and telematics will continue to transform the market landscape, recalibrating customer expectations and opening new ways to reach and acquire clients.
The rise of the sharing economy, in which assets like cars and homes can be shared, is requiring carriers to rethink traditional insurance models.
An outlook for slower economic growth, along with increased M&A and greater regulatory uncertainty, will set the stage for innovative firms to capitalize on an industry in flux in 2016.
Government is getting smarter. That’s one undeniable conclusion from a look back at the big news coming out of public-sector IT in 2015. As government assets go, leaders now realize the tremendous value of the multitude of information they hold: Indiana analyzed 5 billion rows of data to tackle its high infant mortality rate, while Chicago is using a number of data sets to prioritize restaurant inspections in the city. And others are still getting their feet wet in the analytics game. Detroit’s first open data portal launched this year, featuring more than 250 data sets.
Cloud technology continues to transform, with adoption rates ramping up across all levels of government, especially as agencies grow more confident in cloud security. Criminal Justice Information Services certifications for Microsoft in a growing number of states signal a sea change even for public safety agencies, traditionally the most reluctant to make the switch. But as police body camera programs take off in more and more jurisdictions, storage needs increase exponentially and the cloud is fast becoming an important part of the storage solution.
2015 saw more movement toward smart cities. High-profile support came in September with $160 million from the White House aimed at boosting R&D and smart city/Internet of Things projects. Carnegie Mellon University, for one, is equipping its campus with sensors, with Google’s help, and plans to eventually saturate Pittsburgh with the technology. San Francisco’s IoT network will be the largest in the U.S., and its partner plans to build nine more across the country.
Russell Senesac works in Data Center Business Development for Schneider Electric.
Twenty years ago, data centers were looked at through a Wizard of Oz tinted lens. They were a big, powerful and expensive means for data storage, but few business stakeholders outside the IT department really understood their impact – or knew what was going on behind the curtain. The digital revolution flipped this reality on its head. Today, data centers are no longer bulky cost centers, but drivers of business, enabling the data processing and availability modern enterprises need to maintain continuity and gain competitive advantage.
The Importance of Data
Data is everywhere: it is created by nearly everything – tollbooths, online transactions, instant messaging, telephone calls – and it has become earth’s most abundant digital resource. In fact, every day, we create 2.5 quintillion bytes of data. As a result, data has become businesses’ greatest asset and competitive differentiator. Organizations able to quickly and effectively harness, manage and analyze data have the opportunity to enhance customer interactions, offer more strategic solutions, evolve their services, meet increasing consumer demands, and reap huge financial and reputational rewards. As data grows more precious, data center processing power has come under a growing amount of scrutiny. As real-time data transmission becomes the norm rather than the exception, delays in processing can be significantly detrimental to a business’ ability to innovate.
As 2015 draws to a close, U.S. property and casualty insurers continue to adjust rates downward. The composite rate index for all P/C business placed in the United States was down 3% in November 2015, compared to down 2% in October, according to MarketScout.
“There are very few signs of rate increases. The only coverage with seemingly steady rate increases is cyber liability,” Richard Kerr, CEO of MarketScout said in a statement. “Underwriters don’t have a lot of data to use for pricing cyber so we expect pricing to be inconsistent in the near term.”
By coverage classification, property, business interruption, business owners’, inland marine, auto, umbrella, and crime coverages all adjusted down an additional 1% from the prior month. General liability and workers compensation were down an additional 2% compared to last month.
(TNS) - Thousands of Long Island homes damaged by superstorm Sandy are still not covered by flood insurance, leaving much of the region dependent on government aid if another catastrophe hits, a Newsday analysis of federal data has found.
Since the 2012 storm, the number of properties in Nassau and Suffolk counties covered by flood insurance has increased 9 percent, or by roughly 7,600 policies, according to the Federal Emergency Management Agency. That rise, however, falls well short of covering all 26,500 Long Island households that were uninsured when Sandy hit and received FEMA disaster assistance.
“There are people on Long Island who should be buying flood insurance and aren’t,” said Amy Bach, who runs United Policyholders, a San Francisco nonprofit that helps homeowners file insurance claims after disasters. “They are still depending on the government to bail them out.”
When we think of mobility, our first inclination might be to look at the hand-held device in our pocket. But that view is too limited. The concept of enterprise mobility must now extend into a strategic conversation involving the end-user, how content is consumed, how efficiently it’s being delivered, security capabilities, as well as the end-point device. Most of all, real mobility solutions must directly incorporate a good infrastructure strategy. The strategy should revolve around the company’s ability to enable and empower the workforce, giving them greater freedom of access to information and resources.
As users evolve and workloads get more complex, we’ll see an increase in data usage as well as in the kinds of devices accessing the modern enterprise data center. Consider this from the latest Cisco Mobile Forecast and Cloud Index reports:
- Global mobile data traffic will increase nearly tenfold between 2014 and 2019.
- Driven by their increased usage, smartphones will account for three-quarters of mobile data traffic by 2019.
- By 2019, mobile-connected tablets will generate nearly double the traffic generated by the entire global mobile network in 2014.
- Globally, data created by Internet-of-Everything devices will be 277 times higher than the amount of data being transmitted to data centers from end-user devices and 47 times higher than total data center traffic by 2018.
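The "nearly tenfold between 2014 and 2019" figure implies a steep compound annual growth rate. A quick back-of-the-envelope check (illustrative arithmetic only, not taken from the Cisco reports):

```python
# Implied compound annual growth rate (CAGR) for "nearly tenfold
# between 2014 and 2019" -- i.e., roughly a 10x factor over 5 years.
def cagr(growth_factor: float, years: int) -> float:
    """Return the annual growth rate implied by an overall growth factor."""
    return growth_factor ** (1 / years) - 1

rate = cagr(10, 5)
print(f"{rate:.1%}")  # roughly 58.5% per year
```

In other words, mobile data traffic would have to grow by more than half every single year to hit that projection, which is why capacity planning for mobile access can't be an afterthought.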
With all of this in mind, how do we create an enterprise infrastructure strategy that’s capable of keeping up? Where do we deploy data center components that directly enable enterprise mobility? To create real-world enterprise mobility, we have to enable our users, the data center, and the overall business. Here are some things to think about when building an infrastructure that supports greater levels of mobility:
Have you given much thought to the security of your backed-up data? If you haven’t, perhaps it’s time you do, especially in light of new research from Palo Alto Networks.
Palo Alto Networks released a white paper today that takes a close look at how bad guys are able to access mobile device backup data stored on local media, like desktop and laptop computers. The company identified more than 700 samples of six Trojan, adware and HackTool families infecting and hiding in both Windows and Mac operating systems. The malware has been around for at least five years. As the white paper explained:
Mobile security and forensics practitioners have been aware of the technique we describe as ‘BackStab’ for years. Rather than attacking a phone directly, BackStab involves accessing private information that was extracted from the phone through a regular backup routine and stored on a traditional desktop or laptop computer. Law enforcement officials and jealous lovers around the world have used simple tools to capture and extract private phone information from computers to which they have gained access. This includes text messages, photos, geographic location data, and almost any other type of information stored on a mobile device.
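The obvious mitigation is to ensure local device backups are password-protected, since BackStab harvests only unencrypted ones. As a rough illustration (not code from the Palo Alto Networks paper), an iTunes-style iOS backup folder on a desktop contains a Manifest.plist whose IsEncrypted flag records whether the backup was protected; the key name and layout assumed here are the standard iTunes backup format:

```python
import plistlib
from pathlib import Path

def backup_is_encrypted(backup_dir: Path) -> bool:
    """Report whether a local iOS backup folder is password-protected.

    Assumes the backup's Manifest.plist carries an IsEncrypted flag,
    as iTunes-style backups do. Unencrypted backups are exactly what
    the 'BackStab' technique harvests from a compromised computer.
    """
    manifest = backup_dir / "Manifest.plist"
    with manifest.open("rb") as fh:
        data = plistlib.load(fh)
    return bool(data.get("IsEncrypted", False))
```

On macOS these backup folders typically live under ~/Library/Application Support/MobileSync/Backup/; auditing them for a False result is a quick way to spot exposure.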
Demand for big data expertise across a range of industries saw significant growth over the past fiscal year. This is because the majority of industries have seen how dramatically their marketing strategies improve when they capture and analyze data about buyers and suppliers, products and services, as well as consumer preferences and intent.
By using big data, the retail industry has huge potential for better understanding and serving its consumers. Not only is big data a cost-effective way to gain insight into current in-store/online consumer trends and consumer behavior, it enables retailers to effectively target messaging, product creation and supply chain planning via intelligent analytics.
The volume and quality of available data – social-network conversations, Internet purchases, and location-specific smartphone interactions – have dramatically spiked. It only makes sense, then, that the retail industry would adopt data-driven customization. According to a study from the McKinsey Global Institute, retailers who embrace big data analytics can yield a 60 percent boost in margins and a 1 percent improvement in labor productivity.
The enterprise is clearly poised for a dramatic increase in the use of cloud infrastructure for its standard workloads, and even some mission-critical functions as well. But concerns still linger, most of them related to the lack of visibility that occurs once data and applications cross the firewall.
According to a recent survey by Netwrix, more than 65 percent of enterprises are still concerned with security and 40 percent are worried about loss of data control in the cloud. Both of these issues, the company says, stem from the fact that organizations cannot “see” what is happening on third-party infrastructure and therefore cannot tell if they are receiving adequate protection or even the full levels of service and support they are paying for. In all likelihood, organizations will increase their dependence on the cloud regardless, but migrations would move even faster if functions like visibility and auditing of cloud agreements were improved.
Help may be on the way in the form of new visibility services, however. Intel recently announced that its new Snap data telemetry framework has been released as an open source platform, allowing enterprises to improve visibility across data center and cloud infrastructure. The system aims to improve workload scheduling and management by harnessing the full data environment under a unified platform that simplifies collection, ingestion and analysis of telemetry data while supporting both machine learning and cluster control of underlying infrastructure capabilities. In this way, the enterprise is able to maintain control of its environment even as it scales into the cloud or changes dynamically to suit shifting operational needs.
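Snap organizes telemetry as a pipeline of collector, processor, and publisher plugins. The toy sketch below illustrates that collect-process-publish pattern only; it is not Snap's actual API (Snap is a Go-based plugin framework), and all names here are invented:

```python
# Toy collect -> process -> publish telemetry pipeline, illustrating
# the pattern that frameworks like Snap formalize with plugins.
import statistics
from typing import Callable, Iterable

def run_pipeline(collect: Callable[[], Iterable[float]],
                 process: Callable[[Iterable[float]], dict],
                 publish: Callable[[dict], None]) -> None:
    """Wire one collection cycle through processing to a publisher."""
    publish(process(collect()))

# Collector: pretend these are CPU-utilization samples from four hosts.
samples = [0.5, 0.75, 0.25, 1.0]
collect = lambda: samples

# Processor: reduce raw samples to summary metrics.
def summarize(values):
    vals = list(values)
    return {"mean": statistics.mean(vals), "max": max(vals)}

# Publisher: in practice this would ship to a time-series store.
out = {}
run_pipeline(collect, summarize, out.update)
print(out)  # {'mean': 0.625, 'max': 1.0}
```

Keeping the three stages decoupled is what lets a framework swap in new data sources or destinations without touching the analysis logic, which is the visibility benefit the enterprise is after.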
Last year, 'big data' was THE buzz word. And while big data is still buzzing, marketers today are using the term more frequently as part of their everyday conversations. It is clear that big data is no longer just a snazzy new word to use when marketers talk shop, but rather a concept that has proven its worth.
However, before running head-first into the big data leagues, marketers must first pay attention to the data they already have and the rich data assets available from third-party providers. When building a house, you must have a strong foundation before choosing your granite countertops, paint colors, and flooring. The same goes for your customer data. Before learning what your customers are doing on social media, you first need to know basic information such as where they live, their email address, phone number, and other important details. By then layering new types of data onto your current customer profile, you can build a robust, in-the-moment customer picture and know what to sell them - before they even realize they are ready to purchase.
While big data certainly has its place in the overall picture, let’s take a look at all the types of data you should be collecting to really make an impact on your bottom line.
IT security flaws are now myriad, but these four stuck out like sore thumbs at the recent Black Hat Europe 2015 conference on security. Their distinguishing feature for the most part was the massive scale on which hacking could be perpetrated, either because of the number or the size of the systems affected.
Troy Gill has analysed threat trends over the last 12 months and offers his perspective on what lies ahead...
More sophisticated malware will continue to defeat detection by hiding in common services and using non-traditional forms of communication such as TOR or peer to peer.
In tandem, recent highly effective social engineering ploys, such as those utilised in ransomware, will continue to terrorise businesses.
For all malware infections, prevention is definitely better than cure, although I personally don’t see a means of impeding infections 100 percent of the time. However, you can shrink the attack surface significantly by doing the following:
DuPont Fabros Technology, the wholesale data center provider that leases lots of space and power capacity to the likes of Microsoft, Facebook, and Yahoo, is expanding into Toronto, a growing data center market the company says is underserved by data center providers.
The geographic expansion is part of a broad series of strategic changes the technology-focused real estate investment trust is making.
With a new CEO on board – former NTT exec Christopher Eldredge took over the helm from DFT’s founding CEO Hossein Fateh in February – the company is now providing more power density and infrastructure redundancy options than it has before and offering full-service leases in addition to triple-net leases, which used to be its only available option. It has also stopped pursuing retail colocation business, an initiative it announced last year, choosing instead to double down on its traditional bread-and-butter wholesale data center model.
MapR is again beefing up its real-time efforts for big data with the release of MapR Streams. Here's how it works.
Big data vendors are all trying to give their customers something called "situational awareness" -- delivering systems that provide real-time insight into sales, transactions, and other data. MapR, one of the top three Apache Hadoop distribution companies (Cloudera and Hortonworks are the other two), will get closer to that goal with the release of MapR Streams.
The company describes the technology as a real-time "global event streaming system," which will be delivered as part of its MapR Converged Data Platform in early 2016.
So, your agency has reduced the number of overall data centers — mission accomplished, right?
While the Federal Data Center Consolidation Initiative has produced savings, primarily through reduced real estate and power use, few agencies have taken the time to evaluate the performance, operational and security impact to applications and services after the move.
Server virtualization is very powerful for increasing physical resource utilization, and it also provides an easy way to “lift and shift” workloads without any business rationalization.
While this reduces the number of physical servers and data centers to manage, it hasn’t necessarily reduced the number of applications, increased performance, or eased operational maintenance. These new challenges are becoming rapidly apparent as users, administrators and chief information officers adjust to the concept of fewer federal data centers that are farther away.
(TNS) - The South Side Area School District’s approach to school security drastically changed after the December 2012 Sandy Hook Elementary School shooting in Newtown, Conn.
The school board and district administrative team had a “candid” discussion about security in the rural district, which doesn’t have a local police department, Superintendent Tammy Adams said.
They decided that needed to change.
“Although we had security guards, and they do a nice job with crosswalks and manning the buildings, we did not have our own police force,” Adams said. “Should something serious happen, we need someone here.”
The mission for ESF 7 – resource support – is to 'acquire personnel, equipment and other resources to support response and recovery efforts during a disaster.'
'You can't prepare for a specific incident, because we don't know what it's going to be, but what you can do is develop the ability to adapt and respond.'
Ashley K. Speed, Daily Press (Newport News, Va.)
(TNS) - Newport News Police Chief Richard Myers says there is no way to prepare for shootings like those in San Bernardino, Calif., on Wednesday, but police can train on a regular basis in order to respond effectively.
"I would say you can't prepare for a specific incident, because we don't know what it's going to be, but what you can do is develop the ability to adapt and respond — that is why we drill to practice key tactics and skills, but also instill in first responders how to adapt on the fly and react professionally, efficiently and effectively at whatever gets thrown at you," Myers said Thursday.
Fourteen people were killed and 17 were injured Wednesday after husband and wife Syed Rizwan Farook and Tashfeen Malik opened fire at a Christmas party in San Bernardino. Police in Newport News and Hampton train in order to be prepared for active shooter incidents, but also pool the resources of neighboring agencies to produce the best response.
On-server memory solutions are emerging as a key element in warehousing, data lakes and other initiatives targeting Big Data and the Internet of Things, and are even making a run at high-speed transactional processing and other more traditional enterprise applications.
But is it always the right answer for high-speed workflows? And do the multiple varieties of memory have any bearing on successful outcomes?
Like nearly all infrastructure, more advanced technologies produce better results but come at a premium. DDR4 silicon, for example, offers 50 percent more bandwidth and is 35 percent more energy efficient than its DDR3 predecessor. To date, however, DDR4 has been seen mostly in top-end servers and desktops, although vendors like Dell and HPE are starting to trickle it down into lower-end PowerEdge and Proliant machines, says IDG’s Agam Shah. These models can be had for less than $1,000 and boost internal storage by anywhere from half to more than three times that of previous servers.
Cloud infrastructure providers appear to have found a way to convince enterprises to put some of their more sensitive data and applications into the cloud, and that way is linking customers’ servers to their own with direct, private connections, often in the same data center, without using the public internet.
While IBM SoftLayer has offered private connectivity to its cloud out of colocation data centers before, this week it announced a substantial expansion of that effort. It has partnered with several major data center and network service providers to sell this kind of cloud connectivity services in their data centers around the world.
The data center providers are Equinix, Digital Realty Trust, Amsterdam’s Interxion, and Australia’s NextDC. IBM named Verizon and Colt as partner network operators, although both offer data center services too.
Yesterday, Cisco announced a new software release for ACI. If you are looking to automate IT or build out your cloud environment, and want to do so in an open fashion that provides a lot of flexibility, then you’ll probably be interested.
Why? The new ACI release:
- Makes managing and securing your cloud environment easier;
- Provides openness, expanding customer choice; and
- Delivers operational flexibility
So far, it seems that the ISIS attackers who carried out the November 13 terror attacks in Paris planned their attack “in plain sight” and did not use sophisticated means of encrypted communications to coordinate their attacks. The Paris attacks were traditional, physical attacks using guns and explosives, not cyber attacks. Nonetheless, officials in Western nations are seizing on the Paris attacks to promote cybersecurity measures that include censorship, weakened security standards, and militarization of the Internet. Here’s a run-down of what they have proposed.
In France, legislation that extended a temporary state of emergency for three months included “powers to carry out searches of seized devices and to block websites.” The new powers allow authorities not only to search the contents of seized devices, but also any online accounts accessible from the seized device. Additionally, the French Interior Ministry can now immediately block websites it claims are “promoting terrorism or inciting terrorist acts.” Such blocking used to require a 24 hour delay. Some have even pushed to make it illegal to visit any website the government deems to be connected to terrorism.
In the U.K., George Osborne, Chancellor of the Exchequer, announced in a speech at the General Communications Headquarters (GCHQ), that country’s equivalent of the National Security Agency (NSA), that the U.K. plans to double spending on cybersecurity. That spending will go, in part, to building a National Cyber Centre under the GCHQ. It will also fund initiatives to train students in computing, cybersecurity in particular, promote creation of cybersecurity start-ups, and improve cyber defenses at companies.
By: Ben J. Carnevale
If your organization produces and provides testing and calibration results directly to your customer, then ISO 17025 is a subject most likely already being addressed by your internal quality management system and/or risk management team(s).
However, if your organization is just beginning to study, evaluate or pursue the delivery of testing and calibration results to its customer base, then we hope this posting provides at least a general overview of ISO 17025, outlines the benefits your organization would receive if accredited, and helps you determine whether ISO 17025 accreditation makes sense for your organization at this time.
Having a single standard data center design whose delivery can be honed to perfection by replicating it over and over has its advantages for a developer, but it doesn’t necessarily work for every customer, and if the developer wants to branch out, change is inevitable.
Compass Datacenters, the Dallas-based wholesale data center provider co-founded by Digital Realty veteran Chris Crosby in 2012, started with the idea of delivering a standard 1.2 MW all-included single-tenant data center anywhere the customer wants. Now, three years later and with several completed projects under its belt, the company is changing the design so it can provide more capacity under one roof.
The point is to be able to serve customers Compass hasn’t been able to serve before, Crosby explained. One example would be large cloud service providers that usually need a lot of capacity in a short period of time.
Compass has already had some success in the cloud market with its first-generation design. It has built data centers for Windstream Communications – which recently sold its data center business to TierPoint – and for CenturyLink.
It’s a perennial dilemma in IT shops everywhere: So much of the IT budget is spent on keeping aging systems running that the wherewithal to invest in new, transformative technologies too often remains frustratingly elusive. So when an opportunity arose to gain some insight on how to deal with that dilemma, I jumped on it.
That insight is being shared by CGI, a global IT services and consulting firm headquartered in Montreal. CGI recently conducted a survey of nearly 1,000 of its client executives worldwide, the findings of which were included in a white paper that discusses how companies can invest in what CGI calls “step up” activities that achieve market differentiation, while maintaining the “keep up” activities that are core to running the business.
I discussed all of this in an interview last week with João Baptista, president of Eastern, Central, and Southern Europe operations at CGI, and I began the interview by citing one of the key findings of the survey: Even though business executives increasingly view IT as a means of business transformation, only 18 percent of IT budgets are allocated toward transformational investment, with the remaining 82 percent used for maintaining existing business operations. I asked Baptista for his thoughts on why that’s the case, and he said it all goes back to the legacy issue:
Global warming is the biggest threat to the world at the moment. Terrorism, wars and financial meltdown may seem like the biggest single issues, but the reality is that these are all elements that can be recovered from, whereas global warming is the cause of irreversible damage.
Global leaders over the past decade have had countless discussions to try to find the best ways to reduce carbon emissions, although given the power of many of the fossil fuel and other polluting companies, some governments have found their hands tied. Even today, when faced with a mountain of evidence, many in the US still don't believe that global warming exists and report it as such in popular media. As the country releasing the second largest total amount of CO2 every year, that is a concern.
However, it is the role of everybody to show those who don't believe that global warming is an issue that the impact it could have would be devastating. One of the key ways to do this is through data and modelling.
Although big data can offer clear-cut benefits to some organizations, others may be far better suited to utilizing already existing information – according to a study by Harvard Business Review entitled 'You may not need big data after all.'
Already businesses have been adapting to changing data trends. Firms know to look beyond their internal data – ‘small data’ – as it is not sufficient to provide valuable business insights. In our research, we found that half of business executives say they spend more than 10 hours a week seeking business insights derived from external data.
However, not all of the data sourced from external research is actually useful. Our Insight Crisis report reveals the damage that inefficient and unguided research can have on an organization, with ineffective research predicted to cost the UK economy £14 billion a year – hardly small change.
Investment bank Goldman Sachs has been spreading the cash around lately through a series of investments in enterprise technologies ranging from big data to application infrastructure. Its latest foray combines a new datacenter approach with an operational analytics twist.
Startup Vapor IO of Austin, Texas, which emerged from stealth mode in March, said Thursday (Dec. 3) that Goldman Sachs (NYSE: GS) led a Series A funding round that also included AVX Partners. Executives from both investment firms will join Vapor IO’s board.
Vapor IO said the funds would be used to expand its datacenter engineering and development teams along with accelerating product development. It did not disclose the amount of funding raised in the investment round.
Machine learning has been touted recently as the next big step forward for information security; however, the claims are over-optimistic says Simon Crosby.
A recurring claim at security conferences is that ‘security is a big data / machine learning (ML) / artificial intelligence (AI) problem’. This is unfortunately wildly optimistic, and wrong in general. While certain security problems can be addressed by ML/AI algorithms, in general the problem of detecting a malicious actor amidst the vast trove of information collected by most organizations is not one of them.
Our faith in AI is based on personal experience (‘everything cloud is big data and good’) and the memes of the consumerization era. It is tempting to project this optimism into an enterprise context: the idea that it ought to be possible to sift through large amounts of data to find signs of an attack or breach is intuitively reasonable. Moreover, every IT pro managing systems at scale is aware of the value of sophisticated tools that help them pick through large volumes of data to find relevant information to aid troubleshooting and even security investigations.
‘Business resilience could be so much better if we can find a way to harness the power of our people, creating a culture and behaviours that enhance business resilience.’ Robin Gaddum explains why this is the case and looks at what can be done to ‘pull the people lever’.
In my previous article, I described how business resilience is most compelling when it links performance improvement with risk management. Focus on the upside contribution to achieving future strategic objectives and the challenge becomes, 'why wouldn't you do business resilience?'
In this article I will start to explain how to make business resilience deliver on this promise, focusing first on how to unlock the power of purpose and values (see diagram below).
A lot of the conversation about inefficiency of data centers focuses on underutilized servers. The famous splashy New York Times article in 2012 talked about poor server utilization rates and pollution from generator exhaust; we covered a Stanford study earlier this year that revealed just how wide the problem of underutilized compute is in scope.
While those are important issues – the Stanford study found that the 10 million servers humming idly in data centers around the world are worth about $30 billion – there’s another big source of energy waste: data center cooling. It’s no secret that a cooling system can guzzle as much as half of a data center’s entire energy intake.
While web-scale data center operators like Google, Facebook, and Microsoft extol the virtues of their super-efficient designs and get a lot of attention from the press, the fact that’s often ignored is that these companies make up only a small fraction of the world’s total data center footprint.
Do you remember when pocket calculators started to become popular in schools, long before you could access a spreadsheet app on your smartphone? There was a fear that they would lead to pupils being unable to do mental arithmetic, with stories of children reaching for their calculators to add two and two. Now, it’s happening again, except that this time the devices are much “smarter”, let you buy things in shops with just the press of a button (or not even that), and can access and/or contain enterprise data as well as personal data. Laziness or ineptitude could even become a threat to business continuity.
The Bring Your Own Device (BYOD) trend has made the already very complex reality of corporate networks even more complicated. “Bring your own device” is part of the everyday work life in many places and it means that employees want to use popular devices such as smartphones, tablets or laptops on the corporate network. In order not to hinder productivity but, on the contrary, to increase it, many companies already support taking personal devices to work (and using them), as well as allowing access to secured corporate networks from home.
For IT administrators, this means that still more challenges are being added to the already existing challenges of network security, because with the increasing use of personal devices, companies of all sizes have to contend with the problem of network bandwidth. A company should therefore proceed systematically and develop a reasonable strategy.
Since Dell announced its intention to buy EMC for $67 billion in October, there has been a lot of speculation about how the company was going to pay off the massive $40 billion debt it used to finance the deal.
I wrote about some of the EMC assets Dell might consider selling to help offset the huge obligation. Other rumors have had the company selling off its PC business to HP, but it turns out there could be more than one way to reduce the amount due.
Dell has a bunch of pieces that might not be in its future plans, and it could be looking to sell Quest Software and SonicWall, according to a Reuters report. These are two companies Dell bought in 2012 during a bit of an enterprise shopping spree.
In last December’s issue, I predicted that cloud computing, social engagement, government-as-a-platform and the Internet of Things would be drivers of institutional improvement and operational gains in 2015. Government Technology headlines from this year attest to significant advances made by many cities nationwide on these four fronts. The next phase of government innovation will lie at the intersection of these recent advances: To drive technological aptitude forward, municipal governments need to dial down the lag between data collection, analytical output and well-informed action.
To this end, I expect that over the next 24 months the importance of machine learning tools will become substantially clearer to some of the most forward-thinking city governments across the country.
Since machine learning is a computing technique that adapts itself to changing conditions, its most common application will be to make predictions. As machine learning programs are fed more data, they learn more, and so their predictive models become more precise and produce more accurate results. The concept is not new. Machine learning algorithms, for instance, underlie Google search and Siri voice recognition. But recent improvements in cities’ data aptitude — pioneered partly by a growing cadre of dedicated municipal chief data officers and chief technology officers — have unleashed a previously unimaginable capacity for advanced data analytics. As a result, machine learning is positioned to become a powerful management tool for municipal government.
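The claim that predictive models sharpen as they are fed more data can be illustrated with a toy experiment (the numbers and the plain least-squares fit below are invented for illustration, not any specific municipal system): recover the slope of a noisy linear relationship from a small sample and a large one, and compare the errors.

```python
# Toy illustration: fit y = a*x + b by least squares on noisy samples and
# see how the error in the recovered slope shrinks as the sample grows.
import random

def fit_slope(n, true_a=2.0, true_b=5.0, noise=1.0, seed=0):
    """Generate n noisy points on a line and return the least-squares slope."""
    rng = random.Random(seed)
    xs = [rng.uniform(0, 10) for _ in range(n)]
    ys = [true_a * x + true_b + rng.gauss(0, noise) for x in xs]
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

small_err = abs(fit_slope(10) - 2.0)       # slope error with 10 samples
large_err = abs(fit_slope(10_000) - 2.0)   # slope error with 10,000 samples
print(f"error with 10 samples:     {small_err:.3f}")
print(f"error with 10,000 samples: {large_err:.3f}")
```

With ten thousand samples the fitted slope lands very close to the true value of 2.0; tiny samples can miss by much more, which is the same intuition behind feeding more data to a learning system.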
(TNS) - Stippville School, built in 1940, is said to be the first "tornado-proof" rural school in Kansas.
The school, located north of Columbus, closed in 1964, but for years residents of the former mining community used the former school, built with reinforced steel in its walls, floor and roof, as a storm shelter. The structure may be torn down soon to make room for the expansion of Kansas Highway 7.
County officials are talking about building storm shelters in Stippville and eight other rural areas of the county. The preliminary price tag is $182,500. The shelters are estimated to serve up to about 800 people.
"I'd like to see us have shelters in all these unincorporated areas," said Jason Allison, the emergency management director for Cherokee County, who drew up the proposal.
Cloud computing has made data and the processing of it more ubiquitous, efficient and accessible. Cloud-based systems are now a key part of all of our personal lives, from Internet email services to wearables and app platforms. The cloud is now moving rapidly into the healthcare sector as a way to make processes more efficient for medical providers and to allow patients to access their data at any time to keep it up to date. But for a lot of people, saving medical records and private health data in the cloud is an unnerving thought. With the recent breaches of retailers’ consumer data and headline-grabbing celebrity hacks, it’s not ridiculous to assume that anything in the cloud is relatively unsafe.
That assumption may be misplaced. Though security is still a big barrier to cloud adoption, healthcare organizations that have deployed cloud systems, whether it is through electronic medical records or other private cloud analytics services, have seen improvements in their technological capacity, financial metrics, time management, workforce productivity, and reduced security risk, according to the 2014 HIMSS Analytics Cloud Survey.
“Means exist for us to engage more and better share information, including across various care settings and geographic locations (including from the patient’s home)—all thanks to healthcare cloud computing,” said Lee Kim, director of privacy and security for HIMSS North America.
Business Continuity Plan
When business is disrupted, it can cost money. Lost revenues plus extra expenses mean reduced profits. Insurance does not cover all costs and cannot replace customers that defect to the competition. A business continuity plan is essential. Development of a business continuity plan includes four steps:
- Conduct a business impact analysis to identify time-sensitive or critical business functions and processes and the resources that support them.
- Identify, document, and implement recovery strategies for critical business functions and processes.
- Organize a business continuity team and compile a business continuity plan to manage a business disruption.
- Conduct training for the business continuity team and testing and exercises to evaluate recovery strategies and the plan.
Information technology (IT) includes many components such as networks, servers, desktop and laptop computers and wireless devices. The ability to run both office productivity and enterprise software is critical. Therefore, recovery strategies for information technology should be developed so technology can be restored in time to meet the needs of the business. Manual workarounds should be part of the IT plan so business can continue while computer systems are being restored.
Resources for Business Continuity Planning
- Standard on Disaster/Emergency Management and Business Continuity Programs - National Fire Protection Association (NFPA) 1600
- Professional Practices for Business Continuity Professionals - DRI International (non-profit business continuity education and certification body)
- Continuity Guidance Circular 1, Continuity Guidance for Non-Federal Entities - Federal Emergency Management Agency, CGC 1
- Open for Business® Toolkit - Institute for Business & Home Safety
Business continuity impact analysis identifies the effects resulting from disruption of business functions and processes. This information is then used to make decisions about recovery priorities and strategies.
The Operational & Financial Impacts worksheet can be used to capture this information as discussed in Business Impact Analysis. The worksheet should be completed by business function and process managers with sufficient knowledge of the business. Once all worksheets are completed, the worksheets can be tabulated to summarize:
- the operational and financial impacts resulting from the loss of individual business functions and processes
- the point in time when loss of a function or process would result in the identified business impacts
Those functions or processes with the highest potential operational and financial impacts become priorities for restoration. The point in time when a function or process must be recovered, before unacceptable consequences could occur, is often referred to as the “Recovery Time Objective.”
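The tabulation described above can be sketched as a simple prioritization: rank processes by the impact of their loss, keeping each one's Recovery Time Objective alongside. All figures and process names below are invented for illustration.

```python
# Hypothetical BIA summary: restoration priority is driven by the financial
# impact of losing each process; the RTO records how soon it must be back.
processes = [
    {"name": "Order processing", "impact_per_day": 120_000, "rto_hours": 4},
    {"name": "Payroll",          "impact_per_day": 40_000,  "rto_hours": 72},
    {"name": "Email",            "impact_per_day": 15_000,  "rto_hours": 24},
]

# Processes with the highest potential impact become restoration priorities.
ranked = sorted(processes, key=lambda p: p["impact_per_day"], reverse=True)
for p in ranked:
    print(f'{p["name"]}: ${p["impact_per_day"]:,}/day at risk, '
          f'RTO {p["rto_hours"]} hours')
```

In practice the ranking would also weigh operational (non-financial) impacts and dependencies between processes, which a one-column sort cannot capture.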
Resources Required to Support Recovery Strategies
Recovery of a critical or time-sensitive process requires resources. The Business Continuity Resource Requirements worksheet should be completed by business function and process managers. Completed worksheets are used to determine the resource requirements for recovery strategies.
Following an incident that disrupts business operations, resources will be needed to carry out recovery strategies and to restore normal business operations. Resources can come from within the business or be provided by third parties. Resources include:
- Office space, furniture and equipment
- Technology (computers, peripherals, communication equipment, software and data)
- Vital records (electronic and hard copy)
- Production facilities, machinery and equipment
- Inventory including raw materials, finished goods and goods in production
- Utilities (power, natural gas, water, sewer, telephone, internet, wireless)
- Third party services
Since not all resources can be replaced immediately following a loss, managers should estimate the resources that will be needed in the hours, days and weeks following an incident.
Conducting the Business Continuity Impact Analysis
The worksheets Operational and Financial Impacts and Business Continuity Resource Requirements should be distributed to business process managers along with instructions about the process and how the information will be used. After all managers have completed their worksheets, information should be reviewed. Gaps or inconsistencies should be identified. Meetings with individual managers should be held to clarify information and obtain missing information.
After all worksheets have been completed and validated, the priorities for restoration of business processes should be identified. Primary and dependent resource requirements should also be identified. This information will be used to develop recovery strategies.
Zahl Limbuwala is CEO of Romonet.
Data center managers are finding that their Chief Financial Officers are increasingly interested in the organization’s infrastructure. This is only natural; data centers represent a major investment for the business, and CFOs will want to be certain they’re getting value for money.
The challenge then is how to present this value to CFOs and sell them on a particular investment choice. Power Usage Effectiveness (PUE) has often been cited as the de facto indicator of data center performance, but the definition has been stretched to the point where it is almost unusable. As we know, PUE is the ratio of the energy taken in by a data center to that actually used by IT – with 1.0 being the impossible ideal. While this can give an idea of efficiency, it does not provide the full picture. Instead of focusing on abstract measurements like PUE, organizations should concentrate on delivering the best cost for the business.
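The ratio described above is simple to compute; a minimal sketch with made-up energy figures (this illustrates the standard PUE definition, not Romonet's cost modelling):

```python
def pue(total_facility_kwh, it_equipment_kwh):
    """PUE = total facility energy / IT equipment energy; 1.0 is the ideal."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 1,500 kWh while its IT gear consumes 1,000 kWh:
print(pue(1500, 1000))  # 1.5 -- half as much energy again goes to overhead
```

A PUE of 1.5 says nothing about whether the IT load itself is doing useful work, which is exactly the "full picture" criticism: two sites with identical PUE can deliver very different cost per unit of business output.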
Hewlett Packard Enterprise is in London this week for its Discover event, and it could not be more apparent from day one that we are no longer dealing with the HP of the past.
Gone are the days of a pure physical infrastructure play (although there is still a very strong presence) as HP Enterprise moves towards a hybrid infrastructure and transformational solutions that solve business problems. Meg Whitman’s keynote delivered a strong indication that the company is still transforming, and can take those lessons from inside to clients who are also undergoing the same transition. In fact, she specifically called out to existing and new clients that if they’re undergoing a split or divestment, or thinking of a restructure of this scale, come to HPE.
Some of the stereotypical messages around the ‘Digital Enterprise’ were present, mixed with the direction towards a hybrid infrastructure. That felt welcome coming from HPE, but I couldn’t help feeling that to create a brand new legacy for themselves they could have risen above the existing noise from other vendors.
The following is a flight of fantasy, but in the wake of recent events, I believe that we would be foolish not to develop every possible technology to safeguard the lives of our children and their children.
There are many very reasonable arguments about the potential invasiveness of big data and the implications it may have for our privacy. It is already a simple fact that we have no idea who is in possession of what data about us – the Edward Snowden revelations exposed alleged government snooping on an industrial scale. But is this the price of our future security? A price that is worth paying?
After a few more tragic events like the ones in Paris, I think that the debate might finally come into the mainstream. Are our civil liberties worth surrendering (to a certain extent) in order to ensure that potential terrorists are foiled?
Expanding its campaign to eliminate the need for solid-state disks (SSDs) in primary storage environments, Violin Memory today unveiled extensions to its all-Flash storage platform.
Sudhir Prasad, senior director of product management for Violin Memory, says the latest additions to the company’s Flash Storage Platform (FSP) 7000 series push the entry-level cost of the arrays below the $100,000 mark, while at the same time making available a version of the array that can be configured with up to 1.4 petabytes of raw capacity.
Instead of applying the mechanics of disk architectures to primary storage, Prasad says Violin Memory took advantage of its controller technology to create arrays that enable applications to transparently make use of Flash memory directly. The end result is an all-Flash array that provides access to shared storage at rates of up to 2.2 million IOPs.
When you’re an enterprise looking to move critical, or even marginally important, workloads into the cloud, is it better to go with a cloud vendor looking to expand into the enterprise or an enterprise vendor looking to expand into the cloud?
The difference could be crucial. With the former, you get a company that is well-versed in where you want to go, but with the latter you get someone who knows where you’re coming from.
Amazon is the first name that comes to mind when pondering the public cloud, and there is a reason why it is far and away the leading provider at the moment. It has world-class infrastructure that scales beyond any other single platform, and the company has proven itself to be highly flexible and adaptable to changing market conditions. But as SiliconANGLE’s Paul Gillin notes, the company is not without flaws. For one thing, there are no bare-metal options in the Amazon cloud, which is a bit of a deal-breaker when it comes to mission-critical workloads. Also, the company has stiff-armed OpenStack, Cloud Foundry and other open solutions that support hybrid cloud architectures – so when you sign with Amazon, you’re pretty much placing yourself entirely in its hands for your cloud needs.
As marketers look towards 2016, building an improved mobile strategy will be a key priority for many teams. Mobile technology writer Emma Sarran Webster predicts rising consumer adoption of wearable technologies and mobile videos will lead to new delivery methods over the next twelve months. As teams navigate the new landscape of mobile advertising, the biggest pain points of 2015 are also likely to persist throughout 2016. According to VentureBeat, contemporary teams are most likely to struggle with:
● A “hypercompetitive” market due to sky-high mobile adoption rates
● App and platform abandonment on mobile devices
● Creating seamless, disruption-free mobile experiences
Creating engaging, high-return mobile marketing experiences requires accurate personalization. Currently, only 13% of marketing teams are delivering segmented mobile experiences, compared to 43% of desktop campaigns.
(TNS) - While no one really knows what this winter will bring, meteorologists are warning that Georgia could see major snowfall, floods and even tornadoes into spring.
Forecasters see a potentially record-setting El Niño on the horizon, and the impact could be disastrous.
Remember Macon's colossal 16.5 inches of snow in February 1973 and the blizzard of March 1993?
A similar El Niño pattern was in play those years and in 1998 when Georgia saw significant flooding in March.
As company computing demands change, what will the architecture that supports modern businesses and their cloud initiatives look like?
One of the hottest concepts to emerge is infrastructure convergence. We have unified architecture, converged storage, converged infrastructure, and now also hyper-convergence. But what does it all mean? How can convergence apply to your business and your use cases? Let’s take a look at each type of converged infrastructure separately.
There is no denying it: cloud has moved beyond buzzword and become a key foundational element for over 40% of enterprise organizational strategies. Making the decision to “go cloud” is the easy part. Figuring out how to optimize and deliver on your cloud strategy is another discussion.
The “first wave to cloud” was focused on operational and service delivery measures such as reducing IT operational costs and improving organizational delivery to meet SLAs. However, just when you thought you had a handle on these measures, here comes the second wave of cloud.
Reducing operational costs and improving SLAs have become minimum requirements just to get a seat at the table. Being a “player” in the cloud market requires broader impact across the organization and more strategic measures of business success: innovation and increased revenue growth.
What could revamped development and operations (DevOps) mean for independent software vendors (ISVs)?
MSPmentor 501 honoree Logicalis has identified several ways the cloud can enhance the development process for ISVs. The New York-based managed service provider (MSP) noted the cloud "can facilitate DevOps changes and accelerate innovation within the software developer’s organization."
"Developers and their IT counterparts are polar opposites in their business lives," said Brian Day, senior director of cloud services at Logicalis US, in a prepared statement. "But with the help of a savvy cloud or SaaS partner, software providers are realizing significant productivity gains in the people, tools and overall culture of their organizations -- changes that give them a competitive edge in today's fast-paced software development world."
Some early-stage, high-growth companies choose to stay private instead of going public in part so that they have the opportunity to grow under the radar without intense scrutiny from securities regulators and the press. However, as Elizabeth Holmes, the founder, chairman and chief executive officer of Theranos, recently learned, staying private does not guarantee that a company will avoid becoming the target of investigative reports by major media outlets or coming under fire from business partners. If a company has not prepared in advance to weather these storms, in the worst cases, a barrage of negative publicity could potentially cripple a company’s ability to operate. As a result, even though private companies do not have the same regulatory compliance burdens as their public company counterparts, early stage companies with high-growth potential may benefit from adopting sophisticated corporate practices that will provide protection in the event they are scrutinized or challenged.
Misfortune Strikes Theranos, Once Labeled as Silicon Valley’s Darling
Theranos is a Palo Alto, Calif.-headquartered health care and medical laboratory testing company that has asserted that it has developed proprietary technology focused on disrupting blood testing. The company has claimed it has been able to use a finger-prick test to draw blood from patients instead of the traditional, more invasive venipuncture. Holmes, the chief executive officer, fits the profile of many of the most successful founders in Silicon Valley because she dropped out of Stanford at age 19 to found her own company. Media sources estimate that, since the company’s founding, the company has raised $400 million or more from investors, is currently valued at approximately $9 billion and has entered into contractual arrangements with Walgreens, Safeway and others for the roll-out of its testing sites. Because of the company’s rapid rise, some media outlets began calling Holmes the next Steve Jobs.
Black Friday is behind us and Cyber Monday is upon us. For many traditional and ecommerce retailers, today is the most significant online shopping day of the year. And it won’t stop there. Online shopping over the next few weeks will provide a significant boost to many companies’ bottom lines.
Monitoring and communicating information about IT outages and failures associated with online retail shopping can be a daunting task. At any time of the year, IT professionals are under intense pressure to safeguard the security of their organization’s data and physical facilities, and to ensure information continues flowing in the event of a disruption.
As more and more solution providers enter the Disaster Recovery as a Service market--a market that’s expected to grow to almost $20 billion by 2020--there’s growing confusion about how DRaaS differs from Bare Metal Recovery.
Let’s start with a quick definition of each and then explore key differences.
Bare Metal Recovery (BMR)
With Bare Metal Recovery, you can back up an entire physical server disk image--literally, every bit on the disks. This, in turn, allows you to restore an entire system, including the operating system and its settings, applications (including their configurations and updates), files, folders and volumes. This saves you time because you don’t have to reinstall everything from scratch.
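The bit-for-bit imaging idea can be sketched in a few lines, using an ordinary file to stand in for a block device (real BMR tools image whole disks, preserving partition tables, boot records and every setting along with the files):

```python
# Minimal sketch of disk imaging: copy every byte of a "disk" (a file
# standing in for a block device) block by block, then verify the
# restored copy is bit-for-bit identical to the original.
import hashlib

def image(src_path, dst_path, block_size=1024 * 1024):
    """Copy src to dst block by block; return a checksum of the data moved."""
    h = hashlib.sha256()
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            block = src.read(block_size)
            if not block:
                break
            h.update(block)
            dst.write(block)
    return h.hexdigest()

# Simulate: write a ~1 MB sample "disk", back it up, then restore it.
with open("disk.bin", "wb") as f:
    f.write(bytes(range(256)) * 4096)

backup_sum = image("disk.bin", "backup.img")       # back up every bit
restore_sum = image("backup.img", "restored.bin")  # restore the image
print(backup_sum == restore_sum)  # True: restore is bit-for-bit identical
```

Because the image captures every bit, the restore needs no reinstallation step, which is the time saving the article describes; the trade-off is that images are large and must usually be restored to compatible hardware.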
Disaster Recovery as a Service (DRaaS)
DRaaS is the same as BMR, plus the ability to start and run that system in a virtual environment, typically on an appliance or in the cloud, in the event of a man-made or natural catastrophe. Fundamentally, DRaaS lets you “instantly” boot a protected server or your entire site and immediately get back to business, whether you’re booting from the appliance or from the cloud.
As the 2015 Atlantic hurricane season draws to a close, there’s a lot of talk about how the hurricane forecasters got it right this year, due to a strong El Niño.
Over at the Capital Weather Gang blog, Phil Klotzbach, lead author of the Colorado State University (CSU) hurricane forecasting team, writes that all of the forecasting groups predicted a moderate to strong El Niño event this year, and this turned out to be correct.
“In general, seasonal forecasts did a good job anticipating a below-average Atlantic hurricane season in 2015 due to a strong El Niño event. Most seasonal forecasts predicted a bit less activity than was observed, due to a surprising warming of the tropical Atlantic during the peak of hurricane season this year.”
So what are the key takeaways?
As countless experts in the IT channel will attest, specialization is key to building a thriving business. With an increasing number of MSPs and solution providers heeding this advice, it is not uncommon to come across channel partners that are focused solely on serving clients in the healthcare, legal, banking or financial services industries. Now, those who already possess this vertical market expertise have an opportunity to differentiate themselves even further and grow their businesses by focusing on meeting a critical business need--compliance.
SMBs operating in the healthcare, financial services and other regulated industries often do not have the expertise in-house to keep on top of constantly evolving regulatory standards, such as HIPAA, FINRA and PCI DSS. And, with covered entities and business associates now sharing the risk and responsibility for security breaches and data theft, many of these businesses are entering into previously uncharted territory, which is driving the need for Compliance-as-a-Service (CaaS) offerings.
The idea of “composable infrastructure” is gaining steam throughout the IT industry, but is this really a new thing or is it simply another way to market the same modular and software-defined technologies that have already entered the channel?
In all likelihood, it’s a little of both.
HP Enterprise spelled out its vision of a composable future, dubbed “Project Synergy,” which naturally features a healthy dose of HP hardware and software, all tied together with a unified API that covers functions like firmware and driver updating, BIOS configuration and network/storage provisioning. The aim is to disaggregate infrastructure to the point at which applications can quickly compile and reconfigure IT infrastructure to accomplish tasks with the least amount of resource consumption and contention.
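To make "a unified API that provisions compute, storage and networking from one call" concrete, here is a sketch of what such a request might look like. The endpoint path and payload fields are invented for illustration; this is not HPE's actual Synergy API:

```python
import json
from urllib import request

def build_compose_request(api_base, template):
    """Build a POST that asks a (hypothetical) unified composer API to
    assemble compute, storage and network resources from one template."""
    return request.Request(
        f"{api_base}/rest/compose",
        data=json.dumps(template).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_compose_request("https://composer.example.local", {
    "name": "web-tier",
    "compute": {"cores": 16, "memoryGiB": 64},
    "storage": {"volumes": [{"sizeGiB": 500}]},
    "network": {"vlans": ["prod-frontend"]},
})
# request.urlopen(req) would submit it to a real endpoint.
```

The point of the model is that the application describes *what* it needs, and the infrastructure layer works out *which* disaggregated resources satisfy it.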
It’s the time of year when futurists and all manner of tech analysts decide to break out the crystal ball and make some predictions for the coming months and years. Of course, I’m no different, but I don’t need the title of futurist to do it. I’ve decided rather than write one long-winded article about several trends, to break it up a little and write a series of shorter articles tackling each one I see having a major impact on both businesses (vendor and end client alike) and consumers in general.
The first off the blocks? Blockchain.
Blockchain is something I believe will have an impact well beyond Bitcoin financial transactions and the Internet of Things. At the moment, most thinking about Blockchain stems from its definition as a decentralised and distributed digital record, one that can only be updated by consensus of a majority of the participants in the system. Everything digital today is centralised, and therefore can be manipulated and hacked. Not so with Blockchain. And because users remain anonymous, privacy is maintained.
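The tamper-resistance described above comes from chaining hashes: each block's hash covers the previous block's hash, so rewriting any earlier record invalidates everything after it. A toy sketch of just that chaining (deliberately omitting the consensus mechanism, which is the hard part in a real system):

```python
import hashlib
import json

def _digest(record):
    # Hash a record deterministically (sorted keys keep the JSON stable)
    return hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()

def append_block(chain, data):
    """Append a block whose hash covers its data and the previous hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"data": data, "prev_hash": prev_hash}
    block["hash"] = _digest({"data": data, "prev_hash": prev_hash})
    chain.append(block)
    return chain

def verify_chain(chain):
    """Tampering with any earlier block breaks every later hash."""
    prev_hash = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev_hash:
            return False
        expected = _digest(
            {"data": block["data"], "prev_hash": block["prev_hash"]}
        )
        if expected != block["hash"]:
            return False
        prev_hash = block["hash"]
    return True
```

A record altered after the fact no longer matches its stored hash, and any attempt to recompute that hash breaks the link to every subsequent block.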
There is an interesting thing going on lately in my personal life that was previously only a professional phenomenon. Today I’m going to give it the attention it deserves, because I’ve learned that it now dominates every single home electronics conversation.
This “thing” needs a label as I believe it represents a global opportunity for MSPs and solution providers. Today, I’m giving it an acronym because, as we all know, nothing gets solved unless it has an easy handle to latch onto. My acronym for this phenomenon is… wait for it… “IF.”
IF stands for “Integration Fatigue,” which I describe as, “the feeling of frustration that comes from knowing that if two companies actually chose to integrate their products, your life would get better, easier, smarter, cooler, faster, etc.” Examples that readily come to mind:
“In the past, the FBI wanted to operate in the shadows, but today’s Bureau is very different,” said Jay F. Kramer, Supervisory Special Agent, Federal Bureau of Investigation, Cyber Division, New York Office. In an effort to make the FBI more approachable, Kramer recently provided an overview of the cybersecurity activities of the FBI at an event before hundreds of attorneys.
How does the FBI operate?
The Bureau investigates violations of federal law and significant threats to national security, making it uniquely situated to deal with today’s cybersecurity issues. In addition to being a law enforcement agency, the FBI is a member of the US intelligence community. The FBI’s mission is primarily domestic, with 56 field offices across the United States, but it also has offices in 87 countries and shares intelligence on threats coming from overseas by distilling it down and packaging it at the lowest classification level possible to push it out to victims. These overseas relationships enable the Bureau to respond quickly to cyber threats by gaining access to servers, logs and data to help unravel complicated cyber matters from around the world. “When it comes to cybersecurity, you’re never very far from an FBI office and from an actual person that can speak to you about issues that you’re having,” Kramer said.
Here are some of the cybersecurity issues that the FBI is seeing:
The state-owned company that operates all of Russia’s nuclear plants has kicked off construction of what may ultimately become the largest data center in the country.
The project is taking place near one of the company’s plants, on a site it has previously pitched as a location for service providers with infrastructure overseas to house their servers so they can comply with the new law that requires companies to store Russian citizens’ personal data within Russia’s borders.
The company, Rosenergoatom, plans to launch the first phase of the data center in March 2017, the government-owned news agency RIA Novosti reported, citing an official announcement. At full build-out, the facility’s capacity will reach about 80 MW, which according to the announcement will make it the largest data center in Russia.
Protecting an organization from a data breach can seem daunting, and impossibly technical. But the good news is that there are some basic precautions that can help.
There is a saying in the data security community that is a bit tired, but nevertheless true: There are two kinds of companies – those that have been hacked and those that will be hacked. When clients come to us with questions about data security, it is often necessary to consult technological experts in this arena, especially when the client suspects that its data may have been breached. All too often, the client first consults us only after this suspicion has arisen. In these cases, we frequently find that basic and non-technical security best practices have been ignored. In fact, an organization can make some fairly rudimentary changes to secure its data more fully, even without information security expertise.
These suggestions will by no means insulate a company from a data breach, but they may serve to diminish the probability of one.
Organizations are beginning to understand the need for an emergency communications plan in order to improve stakeholder communications during a crisis.
CAVERSHAM, UK – A newly published report from the Business Continuity Institute (BCI) has demonstrated the need for organizations to invest in an emergency communications plan by revealing that nearly two thirds of respondents (62%) to a global survey had activated their plan during the previous year. The urgency of emergency communications is further highlighted by over three quarters of those activations taking place within 30 minutes of an incident commencing.
The Emergency Communications Report, supported by Everbridge, noted that over a quarter of emergency communications plans do not request a response when activated. This is a worrying statistic as, if an incident is important enough to justify the plan being activated, then surely it warrants knowing that the message has been received by the intended recipients.
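A plan that requests a response can be modeled very simply: track who has acknowledged the message and surface who has not. A minimal sketch of that idea (the recipient names and the class itself are illustrative, not part of the BCI survey or any Everbridge product):

```python
class EmergencyNotification:
    """Toy model of an emergency message that requests acknowledgment,
    so the sender knows the message actually reached its recipients."""

    def __init__(self, message, recipients):
        self.message = message
        self._acked = {r: False for r in recipients}

    def acknowledge(self, recipient):
        """Record that a recipient has confirmed receipt."""
        if recipient in self._acked:
            self._acked[recipient] = True

    def response_rate(self):
        """Fraction of recipients who have acknowledged so far."""
        if not self._acked:
            return 0.0
        return sum(self._acked.values()) / len(self._acked)

    def unreached(self):
        """Who still needs a follow-up call, text or visit."""
        return [r for r, ok in self._acked.items() if not ok]
```

Without the acknowledgment step, the sender learns nothing from silence; with it, the unreached list becomes the follow-up work queue.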
Further findings from the report include:
- 14% of respondents reported that they do not have an emergency communications plan.
- Of those that do not have an emergency communications plan, over two-thirds (68%) state they would only create one after a business-affecting event.
- Email is the primary method of communication used during an emergency, with 83% claiming to use it, while 63% use manual call trees, 55% use emergency communication software, 55% use crisis telephone lines and 53% use website announcements.
- Over two-thirds of respondents noted that their organization has emergency communications training and education with regularly scheduled events.
- Nearly three quarters of respondents (72%) stated that their plan is exercised at least once per year, with a further 16% stating it is exercised at least twice per year.
- Common triggers for activating the emergency communications plan include unplanned IT outages (50%), weather-related incidents (49%), power outages (47%), natural disasters (45%) and fire (42%).
- Over two-thirds of respondents (70%) use mobile communications for private messaging to staff.
Patrick Alcantara, Research Associate at the BCI and author of the report, commented: “Reliable emergency communications saves lives and demonstrates how organizations approach their duty of care to their employees, customers and stakeholders. The survey results affirm that many organizations take this duty seriously and offer opportunities for further improvement. We thank Everbridge for supporting this study and sharing our goal of producing top-quality research that impacts practice.”
Imad Mouline, Chief Technology Officer at Everbridge, commented: “The findings highlight that the unpredictability of global threats continue to necessitate a comprehensive enterprise critical communications strategy. While it’s refreshing to see that organizations are more actively developing plans, and using mobile as part of their strategy, there is still work to be done to ensure that communications are securely and reliably reaching employees and customers.”
The report concludes that top management buy-in and integration among different functional roles contribute to the successful embedding of emergency communications capability. Furthermore, organizations must focus on encouraging responses to emergency communications, and this begins by defining acceptable response rates. This should be made easier as mobile communications are increasingly used by organizations as part of their emergency communications arrangements, and the technology has advanced to the point that this is a basic capability. Key to getting buy-in are education and training programmes, which must be implemented as part of an overall holistic approach to continuity and resilience.
For more information, please contact the Senior Communications Manager at the Business Continuity Institute – Andrew Scott CBCI – by emailing email@example.com or by phoning 0118 9478241.
- Download a full copy of the report by clicking here.
- Note on the online survey: This report features 467 responses from 67 countries.
About the Business Continuity Institute
Founded in 1994 with the aim of promoting a more resilient world, the Business Continuity Institute (BCI) has established itself as the world’s leading Institute for business continuity and resilience. The BCI has become the membership and certifying organization of choice for business continuity and resilience professionals globally with over 8,000 members in more than 100 countries, working in an estimated 3,000 organizations in the private, public and third sectors.
The vast experience of the Institute’s broad membership and partner network is built into its world class education, continuing professional development and networking activities. Every year, more than 1,500 people choose BCI training, with options ranging from short awareness raising tools to a full academic qualification, available online and in a classroom. The Institute stands for excellence in the resilience profession and its globally recognised Certified grades provide assurance of technical and professional competency. The BCI offers a wide range of resources for professionals seeking to raise their organization’s level of resilience, and its extensive thought leadership and research programme helps drive the industry forward. With approximately 120 Partners worldwide, the BCI Partnership offers organizations the opportunity to work with the BCI in promoting best practice in business continuity and resilience.
The BCI welcomes everyone with an interest in building resilient organizations, from newcomers and experienced professionals to whole organizations. Further information about the BCI is available at www.thebci.org.
Everbridge is a global provider of SaaS-based unified critical communications solutions. During mission-critical business events or man-made or natural disasters, the Everbridge platform enables customers to quickly and reliably deliver the right message and reach the right people, on the right device, in the right location, at the right time. Utilizing sophisticated communications technologies, Everbridge has the ability to deliver and verify messages in near real-time to more than 100 different communication devices, in over 200 countries and territories, in multiple languages – all simultaneously. Everbridge is based in Boston and Los Angeles, with additional offices in San Francisco, Beijing and London. For more information, visit www.everbridge.com, read the company blog, and follow on Twitter and Facebook.
At the Discover 2015 event in London, Hewlett-Packard Enterprise unveiled a new generation of “composable” systems that enable IT organizations to dynamically provision server, storage and networking resources within the system using a common application programming interface (API).
Gary Thome, vice president and chief engineer of converged data center infrastructure for HPE, says the HPE Synergy platform will for the first time give IT organizations managing IT infrastructure on premise the ability to programmatically manage the entire environment via a single high-level API.
Thome says the HPE Synergy platform, scheduled to be available in the second half of 2016, is designed from the ground up to give IT organizations the flexibility needed to address rapidly changing application workload requirements in a matter of minutes.
From hurricanes to hail to droughts to tornadoes, 2015 was a busy year for extreme weather events. Drought in California continued to worsen, increasing the risk of wildfires. While record rainfall in Texas and Oklahoma alleviated drought, it caused severe flash flooding in Texas. There have been 25 Category 4-5 northern hemisphere tropical cyclones—the most on record to date, breaking the old record of 18 set in 1997 and 2004.
The Insurance Information Institute reported that insured losses from natural disasters in the United States in just the first half of 2015 totaled $12.6 billion—well above the $11.2 billion average in the first halves of 2000 to 2014.
Interstate Restoration provides a look at 2015 weather events:
Why do online shoppers have to take special precautions?
The Internet offers convenience not available from other shopping outlets. From the comfort of your home, you can search for items from multiple vendors, compare prices with a few mouse clicks, and make purchases without waiting in line. However, the Internet is also convenient for attackers, giving them multiple ways to access the personal and financial information of unsuspecting shoppers. Attackers who are able to obtain this information may use it for their own financial gain, either by making purchases themselves or by selling the information to someone else.
How do attackers target online shoppers?
There are three common ways that attackers can take advantage of online shoppers:
- Creating fraudulent sites and email messages – Unlike traditional shopping, where you know that a store is actually the store it claims to be, attackers can create malicious websites or email messages that appear to be legitimate. Attackers may also misrepresent themselves as charities, especially after natural disasters or during holiday seasons. Attackers create these malicious sites and email messages to try to convince you to supply personal and financial information.
- Intercepting insecure transactions – If a vendor does not use encryption, an attacker may be able to intercept your information as it is transmitted.
- Targeting vulnerable computers – If you do not take steps to protect your computer from viruses or other malicious code, an attacker may be able to gain access to your computer and all of the information on it. It is also important for vendors to protect their computers to prevent attackers from accessing customer databases.
How can you protect yourself?
- Do business with reputable vendors – Before providing any personal or financial information, make sure that you are interacting with a reputable, established vendor. Some attackers may try to trick you by creating malicious websites that appear to be legitimate, so you should verify the legitimacy before supplying any information. (See Avoiding Social Engineering and Phishing Attacks and Understanding Web Site Certificates for more information.) Attackers may obtain a site certificate for a malicious website to appear more authentic, so review the certificate information, particularly the "issued to" information. Locate and note phone numbers and physical addresses of vendors in case there is a problem with your transaction or your bill.
- Make sure your information is being encrypted – Many sites use secure sockets layer (SSL) to encrypt information. Indications that your information will be encrypted include a URL that begins with "https:" instead of "http:" and a padlock icon. If the padlock is closed, the information is encrypted. The location of the icon varies by browser; for example, it may be to the right of the address bar or at the bottom of the window. Some attackers try to trick users by adding a fake padlock icon, so make sure that the icon is in the appropriate location for your browser.
- Be wary of emails requesting information – Attackers may attempt to gather information by sending emails requesting that you confirm purchase or account information. (See Avoiding Social Engineering and Phishing Attacks.) Legitimate businesses will not solicit this type of information through email. Do not provide sensitive information through email. If you receive an unsolicited email from a business, instead of clicking on the provided link, directly log on to the authentic website by typing the address yourself. (See Recognizing and Avoiding Email Scams.)
- Use a credit card – There are laws to limit your liability for fraudulent credit card charges, but you may not have the same level of protection for your debit cards. Additionally, because a debit card draws money directly from your bank account, unauthorized charges could leave you with insufficient funds to pay other bills. You can minimize potential damage by using a single, low-limit credit card to make all of your online purchases. Also use a credit card when using a payment gateway such as PayPal, Google Wallet, or Apple Pay.
- Check your shopping app settings – Look for apps that tell you what they do with your data and how they keep it secure. Keep in mind that there is no legal limit on your liability with money stored in a shopping app (or on a gift card). Unless otherwise stated under the terms of service, you are responsible for all charges made through your shopping app.
- Check your statements – Keep a record of your purchases and copies of confirmation pages, and compare them to your bank statements. If there is a discrepancy, report it immediately. (See Preventing and Responding to Identity Theft.)
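Two of the checks above, confirming that a URL promises an encrypted connection and reviewing a certificate's "issued to" information, can be automated. A sketch using only Python's standard library (the hostnames shown are placeholders):

```python
import socket
import ssl
from urllib.parse import urlparse

def is_https(url):
    """True only when the URL uses the encrypted https scheme."""
    return urlparse(url).scheme == "https"

def certificate_subject(hostname, port=443, timeout=5):
    """Fetch a server's certificate and return its subject fields,
    i.e. the "issued to" information worth comparing against the
    vendor you believe you are dealing with."""
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=timeout) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    # cert["subject"] is a sequence of relative distinguished names
    return {name: value for rdn in cert["subject"] for name, value in rdn}

# is_https("https://shop.example.com/checkout") -> True
# certificate_subject("shop.example.com") requires a live connection
# and returns fields such as commonName for manual review.
```

Note that `create_default_context` also verifies the certificate chain and hostname, which is exactly the protection a fake padlock icon tries to counterfeit.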
“Cloud compliance” used to be a dirty word – or at least a scary one. Security concerns ran rampant in the wild west of the cloud’s early days, and IT professionals fell into two camps: those who moved forward with cloud services, holding their collective breath against perceived risk, and those who simply stayed away from the public cloud altogether.
These days, both strategies are unnecessary. Cloud compliance is now very possible, and very do-able, as long as you avoid these five cloud compliance misfires:
The Business Continuity Institute is delighted to welcome Elaine Tomlin MBCI to its Board of Directors, taking over as Membership Director from Bill Crichton FBCI whose time in the role has come to end.
Elaine is the Business Continuity Manager at Certus Ltd in Ireland and the BCI's Global Membership Council representative for Europe. It was her fellow members of the GMC that voted Elaine onto the Board.
On taking up the new role, Elaine commented: "I am truly delighted and honoured to be elected to the BCI Board of Directors. I have been a BCI member for the past 10 years and certainly didn’t predict that one day, I would be joining the Board and supporting the strategic direction of this global organisation. With that in mind, I look forward to meeting the rest of the Board, bringing new ideas to the table, supporting members and delivering strategic plans and activities. I also look forward to my continued involvement and activities within the BCI Global Membership Council. A busy year ahead!"
The BCI would like to thank Bill Crichton for the dedication he has shown to the industry and the Institute during his time as Membership Director. During his time on the Board, the Institute has seen significant change, and this is in no small part due to the effort put in by Bill. Of course, Bill will continue to dedicate some of his time to the BCI in his role as the Chair of the 20/20 Think Tank’s UK Group.
I’m sure we can all agree that when an IT organization experiences a system disruption or outage, the sooner the right people know about it, the sooner the issue will be resolved, and the better the outcome for the business. So what does the organization need to have such a system in place, and to give it the best shot at a quick resolution?
I had the opportunity to discuss this topic in a recent email interview with Vincent Geffray, senior director of product marketing at Everbridge, a Glendale, Calif.-based provider of unified communications systems for incident alerting and management. Geffray explained what’s needed—and why it’s needed—this way:
The enterprise is doing its best to build the infrastructure needed to support Big Data, but not surprisingly most organizations are already starting to feel the strain. Data, after all, has a way of accumulating faster than either hardware or software can handle it, even in the age of rapid scalability.
And as many IT executives are finding out, there is more to Big Data than simply finding a place to store and analyze it.
According to a recent survey of U.S. and UK executives from Researchscape International, nearly half say their current data warehousing platforms are starting to break due to rising analytic workloads. Perhaps coincidentally, about half say they are employing new platforms like Hadoop and Spark for Big Data while the other half is trying to leverage legacy platforms, although there was no indication as to whether it was the latter group that was experiencing the most severe growing pains. A key complaint by about a third of the group, however, was that volume growth is pushing the warehousing budget to unsustainable levels.
You can reduce power in the data center in several ways, from more efficient hardware designs to advanced load balancing and infrastructure management software. But sometimes the direct approach is the most effective: If you want to lessen the power draw, employ low-power hardware.
On the data side of the house, power consumption is mostly a matter of the processors you choose. Also, with new generations hitting the channel that promise greater performance within a lower power envelope than current devices, many organizations will see their power consumption lessen as part of the normal hardware refresh process.
You couldn’t really call it “business as usual” in Paris these days: the hectic pace has slowed, and the city is nearly void of tourists – a big concern, as tourism (domestic and international) brought in a total of 149 billion euros in 2013, 7.3% of French GDP. There has been nary an American accent to be heard in the city’s tourist districts since the Nov. 13 terrorist attacks. Today, some two weeks later, traffic is still lighter than usual, waiting lists at popular restaurants have shrunk, and many shops have begun holiday promotions early. Business leaders are steeling themselves for a tough fourth quarter; political leaders are working to allay fears and collaborating with their cross-border colleagues to combat a terror threat that has now effectively moved beyond specific targets.
But this is not to say that long-term planning on both the business and political fronts has been put on hold or even on the back burner. To wit, the American Chamber of Commerce in France last week released its 16th annual Barometer gauging the mood of American investors in France. The Barometer this year surveyed 125 American companies with offices in France, representing 50,000 employees and more than $40 billion in revenues across a wide spectrum of activities, from manufacturing to financial services to technology. The survey was conducted before the November 13 terrorist attacks.
A survey of businesses in Northern Ireland has revealed the majority of firms in the country regard data breaches as the biggest corporate crisis risk they face.
The study, by law firm Pinsent Masons, revealed 83 per cent of organisations named the loss of corporate or customer data as their biggest threat, ahead of issues such as health and safety accidents or becoming embroiled in a bribery or corruption investigation.
As well as the immediate costs associated with recovering from such an incident, the reputational damage a firm can experience in the event that sensitive information is compromised can be wide-reaching.
(TNS) - The Atlantic hurricane season that marked the 10th anniversary of Hurricane Wilma flooding Florida Keys shores closes Monday without a serious local storm threat this year.
In fact, Wilma in October 2005 was the last hurricane to make landfall anywhere in Florida, extending a record streak of no state landfalls.
The six-month 2015 hurricane season that ends Monday saw three hurricanes, two of which blew into major storms. Hurricane Joaquin almost reached Category 5 status as it raked the Bahamas but it curved out to sea before affecting Florida.
(TNS) - The man soon to become Philadelphia's 99th mayor is not on board with one of the last initiatives of the 98th.
Mayor-elect Jim Kenney has asked the Nutter administration to delay plans to combine the city's 911 emergency dispatch systems and the 311 nonemergency call center under one roof in South Philadelphia.
The request comes as the administration is trying to finalize a 10-year lease agreement on space for a combined operation, dubbed the Unified Call Center, at 20th and Oregon Streets.
In a letter sent to Mayor Nutter on Nov. 20, Kenney said he had "significant concerns" about the cost of the new space and the availability of funds to pay for creating the center.
When the news broke in early October that Dell was planning on buying EMC for a whopping $67 billion, more than a few jaws dropped (including mine), but in the weeks since, reports have surfaced about multiple problems from tax issues to VMware’s plunging stock price and the post-deal creation of Virtustream.
It’s too soon to say the deal is in jeopardy, but there are enough issues that this has to be giving Michael Dell and EMC CEO Joe Tucci some cause to worry (and perhaps pop more than their usual supply of antacid), while giving lawyers, accountants and investment bankers lots of billable hours to sort it all out.
Where to begin.
First, a bit of background: EMC owns an 80 percent stake in VMware, but VMware is traded as a separate company and operates independently with its own stock and board of directors. In a blog post shortly after the deal was announced, Michael Dell sought to reassure customers and partners (and presumably shareholders) that he wasn’t going to mess with VMware when Dell takes over next year.
It’s not clear that the message got through to the shareholders.
(TNS) - The earthquakes just keep coming. Four days after a 4.7 magnitude earthquake was recorded southwest of Cherokee, a 4.4 magnitude earthquake was recorded Monday near Hennessey and Tuesday a 3.0 magnitude quake sprang up about 40 miles southeast of Norman, capping off a run of 23 earthquakes of magnitude 3.0 or higher in a seven-day period.
In response to Thursday’s quake, the Oklahoma Corporation Commission released a plan calling for two disposal wells to stop operations and for many others to reduce volume.
Oklahoma Geological Survey Director Jeremy Boak said it’s a smart move because there’s a clear link between disposal wells and seismic activity in Oklahoma, and he would like to see a balanced approach that allows scientists and policy makers to gather more information.
The backbone of America – banks, oil and gas suppliers, the energy grid – is under constant attack by hackers.
But the biggest cyberattacks, the ones that can blow up chemical tanks and burst dams, are kept secret by a law that shields U.S. corporations, so the details stay in the dark forever.
You could live near -- or work at -- a major facility that has been hacked repeatedly and investigated by the federal government. But you'd never know.
What's more, that secrecy could hurt efforts to defend against future attacks.
The murky information that is publicly available confirms that there is plenty to worry about.
Big data technology vendors are helping to close the skills gap when it comes to training the next crop of data scientists. Here's what they are offering.
There is a bottleneck big data must eliminate on its journey from buzzword to mainstream acceptance: the scarcity of someone called the data scientist.
This person has a PhD in math or statistics and is trained to fish for insights in the data lake. This person crafts algorithms like fishing flies, casting queries out like a line, luring insights to the surface where they can be hooked like trout.
To be a data scientist, one must combine domain experience, a deep background in statistics and math, and programming skills, noted Leon Kutsnelson, director and CTO for IBM Analytics emerging technologies. "We call them 'unicorns' because they don't exist," he said. If industry had to depend on PhDs to do big data, "we [would] continue to sit on mountains of data," he added.
The world is experiencing a digital revolution that is rapidly changing your business landscape. This revolution is not only connecting people with digital technology but also making that technology ubiquitous in all of our lives. The result is tech-savvy employees and customers with new expectations about how to interact with your business.
Meeting these expectations requires a transformation into a consumer technology company, using digital technology to streamline operational processes and improve customer experiences. Industry data indicates that organizations that begin this transformation experience increased profitability, market value and revenue.
Cisco’s private cloud automation facilitates the transformation of your business from manual to automated, standardized service delivery. Converting manual processes into automated workflows increases data center productivity, which helps business and application teams bring new products and services to market faster.
(TNS) - The US weather provided little to be grateful for this Thanksgiving, with rain, snow and freezing rain affecting many parts of the country.
At least 14 people have died as a result of the precarious conditions.
The heaviest snow fell across the mountain states and the Central Plains. Heavy snow and whiteout conditions were reported from the Dakotas to the Great Lakes.
Sioux Falls in South Dakota recorded only 4 cm of snow, but as that mixed with freezing rain, widespread warnings were issued as conditions became treacherous.
Big data is becoming an increasingly important part of the business plan for companies in many different industries. Analyzing large customer datasets and other kinds of data with tools like Hadoop reporting lets companies save money as well as boost revenue by targeting their marketing more precisely, designing products that better appeal to their customers, making better predictions, and so on. On the other hand, this rise in the use of big data has coincided with the rise of advanced persistent threats to data security. Big data is not just lucrative to the companies that collect it: it is also worth money to identity thieves and other bad actors. This has given rise to a cottage industry in hacking and cracking. Companies that use big data, especially if that data consists of customers' personal information, are at an elevated risk of drawing hacking attempts. Developing ways to protect that data will prove to be just as important as the data itself.
The last few years have seen hacking capture headlines on a regular basis. Large companies like Target have become victims, with hackers stealing the credit card information of millions of customers at a time. Even the U.S. government has been affected: the federal Office of Personnel Management was breached earlier this year, and detailed personal information on several million American citizens was stolen by unknown hackers. These breaches are only the latest in a string of such attacks. Furthermore, just because the largest companies are the most likely to make the news does not mean that smaller companies are safe. Hackers know that while large companies tend to control more data, small companies have less robust cyber-defenses, leaving them more vulnerable to organized attack.