When it comes to recent cybersecurity talks, the prevalent theme seemed to be, “We know we need to do something, but what?”
The recurring questions are: Where do we start, and how fast do we need to react to stop cyberattacks? What's become quite clear is that if we are to secure our digital world, we need to do it with technologies that run as fast as the networks and applications in which they operate — in milliseconds.
Repeated time and again in recent discussions is the need for proactive defensive measures in cybersecurity — and how quickly they must react to stop today's hackers. Even the language in the new cybersecurity bill seems to fall short of true cybersecurity protection: it focuses on sharing information to assist in detecting and recovering from cyberattacks, rather than on proactive solutions that would stop the attacks in the first place.
And this leads to a few important questions: Is there a big disconnect between the public and private sectors when it comes to what cybersecurity is supposed to achieve? If so, what is that disconnect, and how can we move forward?
(TNS) - A violent rampage at UC Merced and threats of gunplay at Fresno State earlier this month are prompting universities to reassess the resources and policies in place to ensure safety and security on their campuses, and a school security training is being planned in Angels Camp.
The Rural Domestic Preparedness Consortium will deliver a Department of Homeland Security-certified course in crisis management for school-based incidents in an all-day training Dec. 21 at Bret Harte High School in Angels Camp. The course is free for first responders and school administrators who register by Dec. 7.
At UC Merced, a student stabbed four people with a hunting knife Nov. 4 before being shot and killed by campus police. Two days earlier, a social media post attributed to a California State University, Fresno, student threatened that a shooting would take place that afternoon. Investigators made an arrest within hours.
Mark Armour and David Lindstedt recently proposed Continuity 2.0, a manifesto detailing how current approaches to business continuity planning might evolve. In this article Mark looks at how Continuity 2.0 might be applied in practice.
The following example is by no means definitive. Remember that the Continuity 2.0 principles are not about order of execution. The three steps suggested here provide just one example of how the principles could be applied in a fairly concise execution. So, without further ado: a practical approach to Continuity 2.0 in three easy steps.
The peak of our current El Niño is expected to occur in the next month or so… but what does that mean? We measure El Niño events by how much warmer the surface waters in a specific region of the equatorial Pacific are, compared to their long-term average. The difference from average is known as the “anomaly,” and we use the average anomaly in the Niño3.4 region as our primary index for El Niño. When the index in this region is at its highest, we have our peak El Niño.
However, El Niño-related impacts have been occurring around the globe for months already, and will continue for several months after the warmest temperatures occur in the tropical Pacific Ocean. For example, during the 1997-98 El Niño, the Niño3.4 Index peaked at 2.33°C in November (using ERSSTv4 data, the official dataset for measuring El Niño), and the most substantial U.S. effects occurred through the early spring of 1998. A bit later in this post, we’ll take a look at what’s been going on so far this year.
First, a quick update on the recent El Niño indicators
The average anomaly in the Niño3.4 region during August-October of this year was 1.7°C, second to the same period in 1997 (1).
The atmospheric response to the warmer waters is going strong. The Walker Circulation (tropical near-surface winds blowing from east to west, and upper-level winds blowing from west to east) is substantially weakened, as we expect during a strong El Niño.
In case you’re unimpressed by a 2°C (3.6°F) change, let’s do a little math. The area covered by the Niño3.4 region is a little more than 6 million square kilometers (2.4 million square miles). One cubic meter of water weighs 1,000 kg. So the top two meters (6.6 feet) of the Niño3.4 region contains about 12 quadrillion kilograms (about 13.6 trillion tons) of water.
The energy required to raise one kilogram of water one degree Celsius (the “specific heat”) is 4.19 kilojoules. A 2°C increase in just the top two meters of the Niño3.4 region adds up to an extra 100 quadrillion kilojoules (95 quadrillion BTUs), about equal to the annual energy consumption of the U.S.!
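The arithmetic above can be sketched in a few lines of code. This is just a back-of-the-envelope check using the figures quoted in the text (region area, water density, specific heat), not an official calculation:

```python
# Rough check of the El Niño heat-content figure discussed above.
# All inputs come from the text; results are approximate.

area_km2 = 6.17e6          # Niño3.4 region: "a little more than 6 million square kilometers"
depth_m = 2.0              # top two meters of the ocean
density_kg_m3 = 1000.0     # one cubic meter of water weighs 1,000 kg
specific_heat_kj = 4.19    # kJ to warm 1 kg of water by 1 degree Celsius
warming_c = 2.0            # the 2 degree C increase discussed above

# Convert km^2 to m^2, multiply by depth to get volume, then by density for mass.
mass_kg = area_km2 * 1e6 * depth_m * density_kg_m3
energy_kj = mass_kg * specific_heat_kj * warming_c

print(f"mass of top two meters: {mass_kg:.2e} kg")    # roughly 1.2e16 ("12 quadrillion")
print(f"energy for 2 C warming: {energy_kj:.2e} kJ")  # roughly 1e17 ("100 quadrillion")
```

The result lands right at the "100 quadrillion kilojoules" quoted in the post, which is what makes a seemingly small 2°C anomaly so consequential.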
Who’s feeling the effects?
In the U.S., the season of strongest El Niño impacts is December through March. While we’re waiting to see what the strong 2015-16 El Niño brings us, we’ll look around a few other corners of the world to see what’s happened so far.
El Niño has substantial impacts in two regions of Africa. I checked in with the Climate Prediction Center's International Desk to see what's been going on. In East Africa, including Ethiopia, Somalia, Kenya, Tanzania, Uganda, Burundi, and Rwanda, the primary impact season is October–December, when El Niño tends to enhance the "short rains" rainy season (the "long rains" season, which is much less ENSO-sensitive, is March–May), leading to wetter conditions. Over the last month, rain has begun to increase across much of the area, and some flooding has been seen in Somalia. Short-term forecasts suggest the wetter conditions should continue through the next few weeks, at least.
Southern Africa, including Zimbabwe, Botswana, Namibia, Angola, South Africa, Lesotho, Swaziland, and the southern half of Mozambique, tends to see a drier December–February during an El Niño. Areas of this region, especially South Africa, are very dry right now, after a failed monsoon last year. Another dry year would place more stress on water availability. You can check out recent rainfall conditions in Africa here, and see climate model forecasts for the continent here.
In a couple of short sentences, here are some huge impacts: El Niño-related dry conditions in Indonesia have set the stage for devastating fires, and the region is experiencing the greatest number of forest fires since 1997. Also, all the extra warm waters associated with this El Niño are placing heat stress on sea life, and an intense coral bleaching event is underway.
El Niños tend to enhance the hurricane season in the Pacific, and depress the Atlantic hurricane season. Phil Klotzbach of Colorado State University had this to say about the wild Pacific hurricane season: “So far this year, there have been a total of 21 Category 4 and 5 storms in the North Pacific, shattering the old record of 17, set in 1997. The North Central Pacific region (140-180W) has shattered records for most named storms, hurricanes, and major hurricanes tracking through the 140-180W region.”
According to Lindsey Long of the Climate Prediction Center, the Atlantic season has been fairly quiet, although the number of named storms has been close to average, at 11 storms so far (including Kate, which formed on Monday). The average is about 12… but the overall activity of this storm season (the combined strength and duration of all storms, measured as the Accumulated Cyclone Energy, or ACE) has been less than 60% of average, and we've had 3 hurricanes, half the average number of 6.
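The ACE metric mentioned in the parenthetical has a simple definition: sum the squares of each storm's maximum sustained wind (in knots) at six-hour intervals while the storm is at tropical-storm strength or greater (35 kt or more), then scale by 10⁻⁴. A minimal sketch, using made-up wind values for illustration rather than real storm data:

```python
# Sketch of the Accumulated Cyclone Energy (ACE) calculation.
# Wind values below are hypothetical, purely for illustration.

def ace(six_hourly_max_winds_kt):
    """ACE for one storm, in units of 1e4 kt^2.

    Only six-hourly observations at tropical-storm strength
    (>= 35 kt) count toward the total.
    """
    return 1e-4 * sum(v ** 2 for v in six_hourly_max_winds_kt if v >= 35)

# Hypothetical storm: spins up to hurricane strength, then weakens.
example_winds = [30, 40, 55, 70, 80, 65, 45, 30]
print(round(ace(example_winds), 4))
```

Because the wind speed is squared, a few long-lived intense storms contribute far more ACE than many weak, short-lived ones — which is how a season can have a near-average storm count yet well-below-average activity.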
We won’t know until next spring what exact impact this El Niño will have on the U.S., but it is already making its presence felt around the world.
(1) Note that CPC subtracts the past 30-year "normals" from the current sea surface temperature value to obtain the Niño3.4 anomaly values, and the "normals" are updated every five years. Therefore, the long-term trends are removed. These monthly values are averaged together to obtain our Oceanic Niño Index (ONI).
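The footnote's procedure can be sketched in code: subtract each month's 30-year normal from the observed Niño3.4 sea surface temperature, then average the monthly anomalies into overlapping three-month means. The SST and normal values below are invented for illustration, not CPC data:

```python
# Sketch of the ONI procedure described in the footnote.
# Input values are hypothetical, for illustration only.

def oni(monthly_sst, monthly_normals):
    """Three-month running means of monthly Niño3.4 anomalies."""
    anomalies = [sst - norm for sst, norm in zip(monthly_sst, monthly_normals)]
    # Overlapping three-month averages (e.g. Aug-Sep-Oct, Sep-Oct-Nov).
    return [sum(anomalies[i:i + 3]) / 3 for i in range(len(anomalies) - 2)]

sst     = [28.4, 28.9, 29.1, 28.8]   # hypothetical observed monthly SSTs (deg C)
normals = [26.8, 26.9, 27.2, 27.3]   # hypothetical 30-year normals (deg C)
print([round(v, 2) for v in oni(sst, normals)])
```

Using updated 30-year normals, as the footnote explains, means the index measures departure from the recent climate rather than a warming-inflated departure from a fixed historical baseline.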
A couple of recent studies show that companies continue to struggle with endpoint security. This has to be a serious concern as more employees are connecting to the corporate network through multiple devices.
Let’s look at these different studies. First, last week, MeriTalk and Palo Alto Networks released the Endpoint Epidemic report, which looks at endpoint security within the federal government. Government agencies are failing badly when it comes to endpoint security: 44 percent of endpoints are either unknown or unprotected, and up to half of the agencies are doing little to address the problem, as SC Magazine pointed out:
Just over half of federal IT managers (54 percent) responded that their current policies and standards are very effective, practical or enforceable. Further, less than half said their agency's endpoint security policies and standards are very well integrated into their overall IT security strategy. And, half said their agency isn't taking key steps to validate users and apps.
Cutter Fellow Bob Charette has been blogging over at IEEE Risk Factor for the past decade, looking at the myriad ways software projects fail. To mark that 10-year milestone, he set out to analyze what’s changed — and what hasn’t — in the area of systems development- and operations-related failures.
Bob doesn’t claim to have compiled a comprehensive “database of debacles” in Lessons From a Decade of IT Failures. Instead, he’s endeavored to bring together the “most interesting and illustrative examples of big IT systems and projects gone awry.” Be sure to spend some time with his colleague Josh Romero’s five super cool interactive visualizations of the data where you’ll:
Transforming an acquired technology into a fully integrated product.
In 2014, Citrix acquired a company called ScaleXtreme, as part of our expansion into the world of enterprise SaaS solutions. ScaleXtreme was a powerful tool for automating delivery and management of IT services, and my design team was asked to redesign it to fit in with our existing products.
At the same time, we had to find a way to integrate the new product into an entirely new platform — Citrix Workspace Cloud — which was still being developed.
This was a multi-dimensional challenge — one that many companies have to deal with. Success is far from guaranteed and there are many potential pitfalls. It helps to have a clear strategy, early customer input and, most importantly, teams who all work together to find the right solution.
You probably have an image in mind when you think about Godzilla versus The Blob.
Better yet, you’re probably wondering what these iconic monsters have in common with winter weather. Well, we’re not talking about your typical 1950s monster classics.
Two major climate anomalies are taking place at the same time this year: “Godzilla” and “The Blob.” Those are the names given to two Pacific Ocean surface temperature patterns that are expected to converge later this year and into 2016 (there's also a "Son of Blob," but we'll save that for the sequel). The showdown between the two is expected to result in a more prolonged El Niño season, causing even more unpredictable, potentially severe weather for the United States.
Some experts are predicting that the concurrent timing of Godzilla and The Blob could deliver the U.S. its harshest El Niño weather event in history. The last major El Niño event was in 1997, when, according to some experts, it contributed to severe weather that caused billions of dollars in damage and a number of deaths. This year’s El Niño is expected to be one of the strongest in more than 60 years. The “battle” between the two weather monsters could have major implications for every region of the United States.
Individuals or groups can be nominated until January 8, 2016
As part of President Obama's Climate Action Plan and the National Fish, Wildlife & Plants Climate Adaptation Strategy, an interagency group of federal, state, and tribal agencies today announced creation of a new Climate Adaptation Leadership Award for Natural Resources.
The Award will recognize the actions of individuals and organizations that are making a difference by increasing understanding of climate impacts, adapting to and reducing threats, increasing response capabilities, and providing other innovative approaches to reducing impacts and increasing resilience in a changing climate. It will help spotlight innovative tools and actions that are making a difference now, and serve as a source of inspiration for additional efforts that advance climate smart resource conservation and management.
"Our climate is changing, and these changes are already affecting the nation's valuable wildlife and natural resources," said Michael Bean, Principal Deputy Assistant Secretary of the Interior for Fish and Wildlife and Parks. "This new Award recognizes outstanding leadership by organizations and individuals that is critical to help advance the resilience of our natural resources and the people, communities, and economies that depend on them."
Volunteers help plant native salt marsh grass as part of a 30-acre restoration of Beaver Dam Creek on Great South Bay, Long Island, New York. (Credit: NOAA)
Establishment of the Climate Adaptation Leadership Award for Natural Resources was one of the commitments announced as part of the Administration's Priority Agenda for Enhancing the Climate Resilience of America's Natural Resources in 2014. The agenda calls for a commitment across the federal government to support resilience of America's vital natural resources.
The Award also directly addresses the goals of the National Fish, Wildlife, and Plants Climate Adaptation Strategy, which was developed by a coalition of federal, state, and tribal natural resource agencies. These include:
- Goal 1: Conserve and connect species, habitats and ecosystems;
- Goal 2: Manage species and habitats to protect ecosystem functions and provide sustainable use;
- Goal 3: Enhance management capacity;
- Goal 4: Support adaptive management;
- Goal 5: Increase knowledge and information on natural resource impacts and responses to climate change;
- Goal 6: Increase awareness and motivate action to safeguard natural resources; and
- Goal 7: Reduce non-climate stressors to natural resources.
"State fish and wildlife agencies serve as stewards of the nation's fish and wildlife resources," said Dave Chanda, President of the Association of Fish and Wildlife Agencies, which is helping to lead implementation of the National Fish, Wildlife, and Plants Climate Adaptation Strategy. "Today's threats to fish, wildlife, and their habitats are exacerbated by climate change and underscore the need for incorporating climate adaptation into conservation and science-based management."

Nominations will be accepted until January 8, 2016. Individuals, groups, organizations and government agencies are eligible to apply. Three to five Awards are expected to be announced in 2016.
Fish, wildlife, and plant resources provide important benefits and services to Americans every day, including jobs, income, food, clean water and air, building materials, storm protection, tourism and recreation. For example, hunting, fishing and other wildlife-related recreation contribute an estimated $120 billion to our nation's economy every year, and marine ecosystems sustain a U.S. seafood industry that supports approximately 1.7 million jobs and $200 billion in economic activity annually.
Award sponsors include the U.S. Department of the Interior, U.S. Fish & Wildlife Service, the Commerce Department's National Oceanic and Atmospheric Administration, the Natural Resources Conservation Service and the U.S. Forest Service. They will sponsor the award in collaboration with the National Fish, Wildlife, and Plants Climate Adaptation Strategy's Joint Implementation Working Group, which is composed of representatives from 21 federal, state and tribal natural resource agencies.
For more information about the Award or how to apply, please visit the Climate Adaptation Leadership Award main page.
NOAA’s mission is to understand and predict changes in the Earth's environment, from the depths of the ocean to the surface of the sun, and to conserve and manage our coastal and marine resources. Join us on Facebook, Twitter, Instagram and our other social media channels.
By Jim Whelan and Christine Taylor, The Taneja Group
Virtual backup appliances (VBAs) are an instance of backup software running in a virtual machine on a general purpose server. VBAs are flexible and effective, and they are usually simple to deploy. Having said that, physical backup appliances (PBAs) have distinct advantages in several areas.
PBAs consist of a self-contained, tuned hardware platform with everything you need to perform backups and recovery already installed — compute, storage and software — making them a plug-and-play solution. Capacities generally range from under 10 TB to more than 200 TB, making them attractive to customers from SMBs all the way up to the enterprise.