(TNS) - When a 7.8-magnitude earthquake hit Baguio on July 16, 1990, 5-year-old Klaridelle Reyes was sleeping on a couch. She woke up to a cacophony of voices and loud footsteps. She could hear people shouting, running to safety.
Kyle Yan, a 16-year-old student at Saint Louis University, was also napping on that cold afternoon when the quake struck. He awoke in the commotion and then waded through piles of books and personal belongings that had fallen to the floor during the first few seconds of the quake.
Outside, buildings were starting to crumble, landslides blocked roads and mines collapsed on hapless workers.
When the Rana Plaza building collapsed in Bangladesh, it wasn’t the physical disruption to the supply chain that caused the most damage to organizations at the top, it was the reputational damage as a result of the poor safety standards and human rights abuses taking place further down the chain. A disruption in one organization, whether physical or reputational, will have an impact throughout the entire supply chain.
Have organizations learnt their lesson from the incident above? The risk of organizations breaching international human rights regulations has risen significantly over the last quarter as key Asian economies adapt to tougher economic conditions. That is the conclusion of the latest Risk Index Report from BSI, which identifies China, India, Vietnam, Bangladesh and Myanmar as the five highest-risk countries for human rights violations. Together, these countries account for 48% of global apparel production, 53% of global apparel exports, and 26% of global electronics exports.
The Quarterly BSI Risk Index is based on intelligence from BSI’s Supply Chain Risk Exposure Evaluation Network (SCREEN) tool, which provides real-time incident reports on corporate social responsibility (CSR), security, and business continuity risks and threats, drawn from over 20 proprietary risk categories across 200 countries. Supply Chain Intelligence from SCREEN identifies major CSR concerns, such as brand protection risks and changes to global regulation, including US legislation aimed at eliminating forced child labour, the EU draft conflict minerals law, and the UK’s Modern Slavery Act. All of these relate directly to complex supply chains worldwide and can expose an organization to prosecution if its suppliers violate human rights.
In addition to the legal repercussions, an organization’s brand reputation and consumer trust are compromised. The latest generation of consumers, millennials, is focused on buying from ethical and responsible businesses, underlining the importance for organizations of adopting a supply chain risk management program and implementing risk-based sourcing strategies. Understanding country-level threats provides the intelligence needed to filter risk and underpin a socially compliant and responsible supply chain.
The latest BSI Risk Index report warns that efforts by Asian governments to boost their economies are having the unintended consequence of allowing child labour abuses to become more prevalent in supply chains. The report also highlights proposed changes to labour laws that may incentivise firms to restructure as 'family enterprises', making it easier to employ underage workers in a country where 4.4 million children are already put to work.
Mike Bailey, EMEA Director of Professional Services at BSI, commented: “Organizations can no longer turn a blind eye to the actions of their suppliers. The laws we are seeing today may only apply to larger firms, but they set a benchmark for the industry and smaller organizations will be forced to comply to work with the larger companies, by default. Products assembled or services provided by child labour or depending on minerals from conflict zones have no place in the modern world.”
Less than a third (31%) of global economic losses as a result of natural disasters were covered by insurance (including both private insurers and government-sponsored programs) during the first half of 2015, according to a new study by Aon Benfield. This is slightly above the 10-year average of 27% because the majority of the losses occurred in regions with higher insurance penetration.
By contrast, around 2% of the multi-billion-dollar economic loss from the Nepal earthquake was covered by insurance. Statistics like this show how catastrophe models can play a role in helping the insurance industry to better understand these risks and seek ways to grow insurance penetration in underserved regions.
On a more positive note, losses during the first half of 2015, from both an economic and insured loss perspective, were each below the 10-year (2005-2014) average. Preliminary data from the Global Catastrophe Recap: First Half of 2015 report determined that economic losses were US$46 billion, down 58% from the 10-year average of US$107 billion, and insured losses were US$15 billion, down 47% from the 10-year average of US$28 billion.
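The percentage declines quoted above can be checked directly from the rounded dollar figures in the report; as a sketch (the small drift from the quoted 58% and 47% reflects rounding of the preliminary dollar figures):

```python
# Quick check of the declines quoted in the Aon Benfield half-year report.
# Figures are in US$ billions, as rounded in the text above.
econ_2015, econ_avg = 46, 107   # H1 2015 economic loss vs. 10-year average
ins_2015, ins_avg = 15, 28      # H1 2015 insured loss vs. 10-year average

econ_drop = (econ_avg - econ_2015) / econ_avg * 100   # ~57%, quoted as 58%
ins_drop = (ins_avg - ins_2015) / ins_avg * 100       # ~46%, quoted as 47%

print(round(econ_drop), round(ins_drop))
```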
The severe thunderstorm peril was the costliest disaster type, comprising 33% of the economic loss and 49% of the insured loss. Most of the costs were attributed to strong convective thunderstorm events that prompted widespread hail, damaging straight-line winds, tornadoes, and major flash flooding in the United States during the months of April, May and June.
A clear majority (73%) of the insured losses were sustained in the United States due to an active winter season combined with numerous spring severe convective storm events. Asia Pacific was second with 14% and Europe, Middle East & Africa was third with 11% of the insured loss.
Steve Bowen, associate director and meteorologist with Aon Benfield's Impact Forecasting team, said: "The first half of 2015 was the quietest on an economic and insured loss basis since 2006. Despite having some well-documented disaster events in the United States, Asia Pacific and Europe, it was a largely manageable initial six months of the year for governments and the insurance industry. Looking ahead to the rest of 2015, the continued strengthening of what could be the strongest El Nino in nearly two decades is poised to have far-reaching impacts around the globe. How that translates to disaster losses remains to be seen, but it is something to keep a close eye on in the coming months."
As many parts of the United States enter another day of high heat and humidity, we’re reading about the first ever heatwave warning guidelines issued by the United Nations earlier this month.
The guidelines are intended to alert the general public, health services and government agencies via the development of so-called heatwave early warning systems that should ultimately lead to actions that reduce the effects of hot weather extremes on health.
As the foreword to the publication states:
Public cloud services such as Microsoft Azure and Amazon Web Services will attract almost $70 billion (£45 billion) in spending globally through 2015.
This is according to a new report from International Data Corporation (IDC), published last week (July 21st).
Entitled Cloud Computing: The Essential Foundation of Industry Digital Transformation, the study forecast that approximately 45 per cent of this spending will occur within five key verticals – discrete manufacturing, banking, professional services, process manufacturing and retail.
It also predicted that the number of new cloud products will triple in the next four to five years, with what IDC calls “intelligent industry solutions” – sector-specific cloud platforms that offer big data, mobile and social functionality – driving the bulk of new opportunities in key verticals.
“The technological innovations and enabling capabilities unleashed by cloud have fostered new opportunities across the industries,” commented Eileen Smith, program manager for IDC’s Global Technology and Research Group.
“As a result, it is necessary for both technology vendors and buyers to recognise the industry drivers and barriers of cloud deployment,” she continued.
One such barrier may be the difficulty of data recovery from a public cloud environment.
According to a survey of UK companies carried out by Connected Data in April, almost one in five firms (19 per cent) have “no idea” how long it would take to recover data and resume business if one of their public cloud services went down.
A further 26 per cent admitted they did not fully understand the costs associated with data recovery from cloud and virtual environments.
When looking for data recovery services, choose a provider with a proven track record: Ontrack Data Recovery has 40,000 data recovery stories to tell every year.
WASHINGTON – August 2015 marks the tenth year since the devastating 2005 Atlantic Hurricane Season. According to the National Oceanic and Atmospheric Administration (NOAA), Hurricane Katrina was one of the strongest storms to impact the coast of the United States, causing widespread devastation across an estimated 90,000 square miles of the central Gulf Coast states. Less than a month later Hurricane Rita made landfall, followed by Hurricane Wilma in October, compounding an already catastrophic situation.
Ten years into the recovery, FEMA continues to support communities and families, working side-by-side with state, local, and tribal partners to finish the job of rebuilding communities that are the economic engines and lifeblood of the Gulf Coast. To date, FEMA has provided $6.7 billion to more than one million individuals and households. FEMA provided more than $13.1 billion to the states of Louisiana, Mississippi, Alabama, and Florida for public works projects in the aftermath of Hurricane Katrina to assist with recovery efforts.
“Today, FEMA has the authority necessary to lean forward and leverage the entire emergency management team in response and recovery efforts,” said FEMA Administrator Craig Fugate. “This team includes not only government but also the private sector, non-profits, and citizens themselves. We support survivors and this holistic approach emphasizes the importance of working as a team to prevent, protect against, respond to, recover from, and mitigate all hazards.”
Since 2005, FEMA has significantly improved its ability to assist communities in responding to and recovering from disasters. With the support of Congress, FEMA was provided additional authorities and tools to become a more effective and efficient agency, one that is focused on putting survivors first. Specifically, the Post-Katrina Emergency Management Reform Act (PKEMRA) of 2006, gave FEMA clear guidance on its mission and priorities, and provided the legislative authorities needed to better partner with state, local, tribal, and territorial governments before, during, and after disasters. These improvements include:
- Improved ability to provide support to states and tribes ahead of a disaster. Since 2005, FEMA gained statutory authority to surge resources to states, tribes, and territories ahead of a disaster should the capacity of states, tribes or territories become overwhelmed. This authority expedites FEMA’s ability to respond to disasters if and when a state, tribe or territory requests support and a disaster is declared by the President.
- Development of a National Disaster Recovery Framework (NDRF). PKEMRA required FEMA, along with its partners, to develop a national disaster recovery strategy to guide recovery efforts after major disasters and emergencies. The NDRF clearly defines coordination structures, leadership roles and responsibilities, and guidance for federal agencies, state, local, territorial, and tribal governments, and other partners involved in disaster planning and recovery.
- Establishment of Incident Management Assistance Teams. These full time, rapid response teams are able to deploy within two hours and arrive at an incident within 12 hours to support the local incident commander. The teams support the initial establishment of a unified command and provide situational awareness for federal and state decision makers crucial to determining the level and type of immediate federal support that may be required.
- Improved Search and Rescue capability. Since 2005, FEMA has better integrated search and rescue assets from across diverse Federal agencies such as the U.S. Coast Guard and the Department of the Interior.
- Establishment of Regional Emergency Communications Coordination Working Groups (RECCWGs). These groups serve as the primary focal points for interoperable communications coordination among federal, state, local, tribal and territorial emergency responders. The statute charges the RECCWGs with coordinating effective multi-jurisdictional and multi-agency emergency communications networks for use during disasters and emergencies.
- Enhanced partnerships with the private sector. As part of this effort, FEMA established the National Business Emergency Operations Center that serves as a clearinghouse for two-way information sharing between public and private sector stakeholders in preparing for, responding to, recovering from, and mitigating disasters.
- Support for the inclusion of people with access and functional needs. The Office of Disability Integration and Coordination was established to provide technical assistance and guidance for a wide range of emergency management activities, including equal access to emergency programs and services and meeting the access and functional needs of the whole community. This includes: preparedness, exercises, emergency alerting, accessible transportation and shelter accessibility guidance, assistive technology devices for accessible communication, accessible housing and grant guidance to states for accessibility, and partnership and stakeholder outreach.
For more information on FEMA’s continued work to support communities and families along the Gulf Coast, visit our Hurricane Katrina: A Decade of Progress through Partnerships website.
FEMA's mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain and improve our capability to prepare for, protect against, respond to, recover from and mitigate all hazards.
The social media links provided are for reference only. FEMA does not endorse any non-government websites, companies or applications.
Searchmetrics releases annual study of ranking factors for top ranking search results on Google.com
SAN MATEO – New research from Searchmetrics, the leader in Search Experience Optimization which includes SEO and Content Performance Marketing, concludes that Google is now better able than ever to give higher search rankings to good-quality web content that is easy to understand and relevant to the search query, and is more focused than ever on doing so. Positive user signals such as time on site help Google assign relevance values to content. The company’s analysis indicates the days are numbered for old-style SEO tactics such as link building or emphasizing relevant keywords and search phrases.
Additional new findings include that, on average, 30% of sites appearing in the top 30 Google US search results use responsive design, which optimizes the search experience by automatically adjusting the page format to suit a mobile, tablet or desktop screen. As increasing numbers of users search on the move, responsive web design is becoming more important.
In its annual study Search Ranking Factors and Rank Correlations – Google US 2015, Searchmetrics once again analyzed the top 30 search results for 10,000 relevant keywords and 300,000 websites appearing on Google.com. The aim of the analysis (which has been carried out every year since 2012) is to identify the key factors that high ranking web pages have in common and provide insights and benchmarks to help marketers, webmasters and SEO professionals. The study measures the correlation between the presence of a wide list of factors amongst high-ranking Google.com search results.
The most important findings include:
- Relevant, comprehensive content is more important than ever
Factors associated with the quality of content on web pages are increasing in importance. Higher ranking pages tend to have more words and are better able to give searchers the information they are looking for by covering topics more comprehensively, as well as being easier to read and understand. Since last year the average word count on pages in the top 10 search results has increased by around a third (rising from 975 to 1,285 words), while there is evidence that high ranking pages cover topics more comprehensively, touching on a variety of related topics.
- Pay attention to user experience and user signals
Websites which rank higher are better structured, more often responsive, have a better internal link structure and offer a user-friendly experience. “Closely related to content and user experience are user signals such as time on site and bounce rates, because they tell Google if people find your information useful and engaging,” said Tober. “Google can measure these signals very effectively, for example by analyzing user behavior from people who use its Google Chrome web browser. So if you want your pages to rank well, you can’t get away with providing content that isn’t relevant or by providing a poor experience.”
- Technical optimization is a basic requirement for rankings
In addition, technical factors such as having a title tag and description in a web page’s underlying source code, and having pages that are quick to load, are standard requirements that almost all pages in the top 30 results display.
- Decline of single keyword focus and relevance of backlinks
The number of backlinks to a page from other pages is still a factor that is highly correlated with search ranking positions but its importance is declining.
The most important message from the 2015 study is that, as the trend away from keywords and towards relevant content continues, high-ranking sites are shifting their focus from using keywords based on search queries to understanding the user’s intention as a whole and reflecting this in quality, logically structured content.
Marcus Tober, CTO and Founder of Searchmetrics, reemphasizes this finding: “Our research indicates that simplistic tactics that may have been effective in the past - such as increasing the number of keyword mentions on the page or using keywords in the domain name - are not enough to lift you up the search results. You need to try to understand the searcher’s intention and ensure your web content gives them the information they are looking for, covering topics in sufficient detail.”
To download the full, newly designed Searchmetrics whitepaper please visit:
An infographic highlighting the results can be viewed here:
Searchmetrics analysis reveals that high ranking pages tend to have content that is easier to read and understand, as measured by the Flesch readability scale. In 2015 the analysis found that pages in the top ten search results have a slightly higher Flesch readability score than in 2014, indicating that higher ranking pages are getting even easier to read.
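For readers unfamiliar with the metric, the Flesch Reading Ease score combines average sentence length and average syllables per word; higher scores mean easier text. A minimal sketch, using a naive vowel-group heuristic for syllables rather than a dictionary (so scores are approximate):

```python
# Rough sketch of the Flesch Reading Ease formula referenced by the study.
# Syllable counting here is a crude vowel-group heuristic, not a dictionary
# lookup, so results are approximations.
import re

def count_syllables(word):
    # Treat each run of consecutive vowels as one syllable; minimum of one.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

# Short, monosyllabic sentences score very high (i.e. very easy to read).
print(round(flesch_reading_ease("The cat sat on the mat."), 1))
```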
About the study
Conducted in April 2015, the study analyzed Google US (Google.com) search results for 10,000 keywords and 300,000 websites featuring in the top 30 positions, as well as billions of backlinks, Tweets, Google +1s, Pins, and Facebook likes, shares and comments. The correlations between different factors and the Google search rankings were calculated using Spearman's rank correlation coefficient. This year the study underwent an overhaul and now features a brand new design with more intuitive charts.
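Spearman's rank correlation measures how well the ordering of one variable tracks the ordering of another, which is why it suits rank-position data like search results. A hand-rolled sketch assuming no tied values (in practice a library routine such as scipy's spearmanr, which also handles ties, would be used); the data below is purely hypothetical:

```python
# Minimal, ties-free illustration of Spearman's rank correlation, the
# statistic the Searchmetrics study uses to relate factors to rankings.
def spearman(x, y):
    def ranks(values):
        # Rank 1 = smallest value; assumes all values are distinct.
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical data: page word count vs. search position (1 = top result).
# Longer pages ranking higher yields a strong negative correlation with
# position number.
word_counts = [1400, 1200, 1000, 900, 700]
positions = [1, 2, 3, 4, 5]
print(spearman(word_counts, positions))
```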
Searchmetrics, founded in 2005, is the pioneer and leading global enterprise platform for Search Experience Optimization. Search Experience Optimization combines SEO, Content Performance Marketing, Social Media and PR analysis to create the foundation for developing and executing a successful content strategy. It places the spotlight on the customer, contributing to a superior and memorable online experience.
Over 100,000 users from more than 8,000 brands use the Searchmetrics Suite to plan, execute, measure and report on their digital marketing strategies. Supported by its Research Cloud, which is a unique continually updated global data and knowledge repository, Searchmetrics answers the key questions asked by SEO professionals and digital marketers. It delivers a wealth of forecasts, analytic insights and recommendations that boost visibility and engagement, and increase online revenue. Many respected brands, such as T-Mobile, eBay, Siemens, Zalando, Tripadvisor and Symantec, rely on the Searchmetrics Suite.
Searchmetrics has offices in Berlin, San Mateo, New York, London, and Paris, and is backed by Holtzbrinck Digital, Neuhaus Partners and Iris Capital.
Enhancements to Avocent® Universal Management Gateway Appliance Reduce Risk of Intrusion and Expand Multi-Vendor Support
HUNTSVILLE, Ala. – Emerson Network Power, a business of Emerson (NYSE: EMR) and the world’s leading provider of critical infrastructure for information and communications technology systems, today introduced firmware updates to its Avocent® Universal Management Gateway (UMG) appliance that further harden the industry’s preeminent secure UMG platform. The enhanced Avocent UMG is available globally.
Already the only management system on the market that can manage data center infrastructure through service processor, serial and KVM (keyboard, video, mouse) connections, the enhanced Avocent UMG further refines root-level access controls, reducing the risk of malicious intrusion. The latest firmware release (available here) provides updated support for a long list of service processors—including recent additions Oracle iLOM3, Dell iDrac8 and Dell Series C—and service processor firmware. These enhancements, and the Avocent UMG’s ability to isolate the management network from the rest of the environment, provide an added layer of security that bolsters overall data center security in multi-vendor environments.
Other recent enhancements to the Avocent UMG are designed to bolster the system’s ability to provide alerts of failures and potential failures in multi-vendor environments. Data center managers can configure and opt-in to alerts tailored to the most relevant platform event traps (PETs)—including everything from the failure of the CPU, hard drive or server processor, to an OS blue screen—across multiple vendors and systems.
“Managing IT security is a complex task that grows in complexity with the introduction of new technologies,” said Jay Wirts, general manager, Avocent Core Products, Emerson Network Power. “With today’s firmware enhancements to the Avocent UMG appliance, we have taken another significant step toward enabling not just a more secure network, but a more flexible, productive data center.”
The Avocent UMG is the industry’s only consolidated management appliance enabling secure, centralized remote management of almost any device in any location. The enhanced Avocent UMG functions in a heterogeneous environment, identifying and accessing servers and service processors whether connected physically to the appliance or through the network. That access eliminates the need for inputting multiple IP credentials, logins and passwords, streamlining network operations and machine-to-machine communication.
Users can save Avocent UMG configuration files from the Web UI—an HTML-based application used to configure and manage appliances remotely—eliminating the need to repeat setup every time the Avocent UMG configuration changes. Other recent enhancements enable users to record up to 40 simultaneous KVM sessions and patch three recent security vulnerabilities (POODLE, GHOST, JetLeak).
To learn more about Emerson Network Power’s hardware and software solutions, please visit www.EmersonNetworkPower.com.
About Emerson Network Power
Emerson Network Power, a business of Emerson (NYSE:EMR), is the world’s leading provider of critical infrastructure technologies and life cycle services for information and communications technology systems. With an expansive portfolio of intelligent, rapidly deployable hardware and software solutions for power, thermal and infrastructure management, Emerson Network Power enables efficient, highly-available networks. Learn more at www.EmersonNetworkPower.com.
Emerson (NYSE: EMR), based in St. Louis, Missouri (USA), is a global leader in bringing technology and engineering together to provide innovative solutions for customers in industrial, commercial and consumer markets around the world. The company is comprised of five business segments: Process Management, Industrial Automation, Network Power, Climate Technologies, and Commercial & Residential Solutions. Sales in fiscal 2014 were $24.5 billion. For more information, visit www.Emerson.com.
Affinity Sutton is delighted to announce that it has selected Hitachi Solutions Europe to deploy a Microsoft Enterprise Resource Planning (ERP) solution across the whole of its business. The selection of Hitachi Solutions is the result of a robust selection process which started in 2014. ERP systems are used widely in the corporate sector and provide an integrated solution for core business processes, often in real time, maintained by a single database management system. The applications that make up the system share data across all the departments and functions in a company.
Mark Washer, Group Finance Director, said: “This is a hugely exciting appointment and marks what we hope will be the start of a long and successful partnership with Hitachi Solutions to deliver state-of-the-art, fully integrated systems for the benefit of our customers and staff. We are making this investment because our current systems have come to the end of their useful lives and we want to deliver much more of our services online and in our residents’ homes. The expectations of our customers are changing and the business needs to keep pace.”
Affinity Sutton launched its business transformation programme, Future Foundations, in 2013. The Programme will transform the way the Group does business. New ways of working and a multi-channel approach to working with our customers will be facilitated by world-class CRM and ERP solutions from Microsoft. Staff will interact with a single, seamless IT solution rather than the multitude of different systems currently common in the sector. Customers will be able to enquire and report via a new range of self-service online digital channels, enabling much faster resolution of issues no matter which communications channel the customer chooses. It will also benefit staff by providing unified data and a “single version of the truth”.
The solution, which comprises Microsoft Dynamics CRM, Dynamics AX and Hitachi Solutions’ own Field Service Automation software, is a long term investment in the business and will provide a flexible, sustainable platform from which to improve customer services, streamline processes, reduce complexity and support future growth. Making working practices and processes more effective will create a more efficient organisation, delivering better value for money which ultimately means Affinity Sutton will have more capacity to build new homes and invest in its social purpose.
“I am excited that Hitachi Solutions has won this project to fundamentally improve the core processes across the business for Affinity Sutton. Housing Associations are a key market for us and during the bid process we were able to showcase our knowledge of their business and demonstrate how Microsoft technologies can improve customer service whilst reducing costs,” adds Steven French, Executive Vice President, Hitachi Solutions Europe.
Italian cloud infrastructure expert implements Tintri solution in its own Data Centres
LONDON, UK – Tintri, a leading provider of VM-aware storage (VAS) for virtualisation and cloud environments, today announced it has signed a new partner agreement with Italian cloud infrastructure provider, Clouditalia. As a leading VAR that specialises in cloud and telecommunications services, Clouditalia sells predominantly to medium-sized Italian companies.
Clouditalia, founded in 2012, is the first company to offer both cloud and telecommunications services to medium-sized customers and partners across Italy. The company provides a wide range of services from its 100Gbps fibre backbone and two advanced Data Centres. With demand for cloud services steadily growing and changing, Tintri allows Clouditalia to reinforce its competitive edge with a stronger, more dynamic and high performing technology infrastructure.
Tintri storage is specifically built for virtualised applications, so that organisations can take or automate every storage action at the virtual machine (VM) level. Tintri systems are trusted with hundreds of thousands of virtual machines running business critical databases, enterprise apps, desktops and mobile apps and private cloud deployments.
Doug Rich, VP EMEA, Tintri, said: “We are delighted that Clouditalia has chosen Tintri to differentiate its services. Clouditalia offers Tintri an ideal platform for long term growth in the mid-market. This is a great opportunity to build our respective businesses."
Marco Iannucci, CEO, Clouditalia commented: "The partnership with Tintri is in line with the strategic development of Clouditalia’s offering. We always put our customers’ needs first. By including Tintri’s solutions in our data centres, Clouditalia enables customers to align resources and performance with the needs of each single application. Our customers will enjoy the greater control, simplicity and scalability offered by our enhanced cloud services.”
Tintri builds smart storage that sees, learns and adapts, enabling IT organisations to focus on virtualised applications and business services instead of managing storage infrastructure. Tintri application-aware storage eliminates planning and complex troubleshooting by providing VM-level visibility, control, insight and agility, with all-flash performance for virtualised environments and the cloud. Tintri powers hundreds of thousands of virtual machines running business critical databases, enterprise apps, desktops and mobile apps, and private cloud deployments. Tintri helps global enterprises such as AMD, F5 Networks, GE, NEC, NTT, MillerCoors and Time Warner maximise their virtualisation and cloud investments. For more information, visit www.tintri.com and follow us on Twitter: @Tintri
Clouditalia is the first Italian company providing integrated cloud and telecommunications services dedicated, above all, to medium-sized companies. Basing its best-in-class service offerings on 14,000 kilometres of fibre and state-of-the-art data centres, Clouditalia delivers tailored telecommunications, connectivity, and cloud computing solutions – all in one. In December 2014 Clouditalia deployed the Coriant hiT 7300 Multi-Haul Transport Platform to increase capacity and extend the flexibility of its long haul transport network. This network expansion included upgrades to transmission speeds of 100 gigabits per second (100G) and enhanced connectivity in major network sites, including Rome, Milan, Bologna, and Pisa.