
Jon Seals

Logicalis SMC and HPE combine to accelerate digital transformation via a complete, rapidly deployable big data ITSM solution


LONDON – Logicalis, the international IT solutions and managed services provider, has announced the launch of IT Custodian, a turnkey ITSM solution developed jointly with technology leader Hewlett Packard Enterprise (HPE) and Logicalis' Service Management Consulting (SMC) business to advance the digital enablement of large organisations. The solution combines Logicalis SMC best practice with HPE Propel technology, and centres on a prebuilt, standard process model that promises fast, successful implementation at a fixed cost. IT Custodian is aligned to the Open Group IT4IT™ framework and is available on-premises or as a cloud service.


“Service Management is more than capable of achieving transformational performance at the speed of digital innovation, but traditional approaches to extending the service desk and embracing ITSM can be difficult to budget and take many months to implement correctly. This is far from ideal at a time when IT departments urgently seek to regain control over IT services and become the ‘internal service provider’ to the business,” explained Martyn Birchall, Director, International Service Management Consulting at Logicalis. “With IT Custodian, instead of losing time adapting ITSM technologies to meet their bespoke needs, organisations can rapidly adopt a best of breed, best practice model relevant to their business challenge, which is based on lessons learned with hundreds of major organisations.”


According to recent Logicalis research[1] highlighting the effects of the so-called shadow IT phenomenon, 31% of CIOs globally are now routinely side-lined when it comes to making IT purchasing decisions. IT Custodian addresses this and other governance issues within a fully supported framework that leverages ITSM best practice gained from more than 17 years of Logicalis Service Management consultancy experience.


“The IT Custodian solution extends the benefits of HPE Service Management technology with a ready to adopt implementation model that includes everything a service-defined enterprise needs for ITSM in a single solution,” said Kevin Leslie, EMEA Director of Service Portfolio Management at HPE. “Enterprise CIOs and line-of-business executives now have a proven, repeatable approach to Service Management that curbs the risks associated with shadow IT and delivers the benefits of a rich, dynamic, multi-source IT environment with full budgetary control and governance.”


For more resources about the Service-Defined Enterprise (SDE), including a forthcoming workshop series for CIOs and IT directors, visit www.uk.logicalis.com/sde.

[1] Logicalis Global CIO Survey 2015 http://www.logicalis.com/knowledge-share/downloads/cioreport-2015-the-shadow-it-phenomenon/logicalis-cio-report-2015/


ABOUT LOGICALIS

Logicalis is an international IT solutions and managed services provider with a breadth of knowledge and expertise in communications and collaboration; data centre and cloud services; and managed services.

Logicalis employs over 4,000 people worldwide, including highly trained service specialists who design, deploy and manage complex IT infrastructures to meet the needs of over 6,500 corporate and public sector customers. To achieve this, Logicalis maintains strong partnerships with technology leaders such as Cisco, HPE, IBM, CA Technologies, NetApp, Microsoft, Oracle, VMware and ServiceNow on an international basis. It has specialised solutions for enterprise and medium-sized companies in vertical markets covering financial services, TMT (telecommunications, media and technology), education, healthcare, retail, government, manufacturing and professional services, helping customers benefit from cutting-edge technologies in a cost-effective way.

The Logicalis Group has annualised revenues of over $1.5 billion, from operations in Europe, North America, Latin America and Asia Pacific, and is one of the leading IT and communications solution integrators, specialising in the areas of advanced technologies and services.

The Logicalis Group is a division of Datatec Limited, listed on the AIM market of the LSE and the Johannesburg Stock Exchange, with revenues of over $6 billion.

For more information, visit www.uk.logicalis.com

FalconStor experts offer tips on architecting modern data centers for hyper-scale requirements

MELVILLE, N.Y. — Storage workloads in modern data centers increasingly require scale-out environments to run demanding enterprise applications. These hyper-scale architectures can benefit greatly from software-defined storage (SDS) in terms of economic value, flexibility, and operational efficiency, according to experts at FalconStor Software® Inc. (NASDAQ: FALC), a 15-year innovator of software-defined storage solutions.

Scale-out workloads such as NoSQL databases, online transaction processing (OLTP), cloud, and big data analytics are hungry for the performance and capacity needed to provide appropriate service levels to end users and applications. The architecture of a hyper-scale data center that must grow to meet compute, memory, and storage requirements on demand often depends on the nature of the applications and on business priorities such as flexible capacity, security, and uptime. Projects typically are driven by the overall cost of ownership, particularly as requirements reach hundreds of petabytes.

“Many modern applications that need these hyper-scale scale-out environments offer built-in resiliency, protect themselves from hardware failures, and can self-heal, which eliminates the need to build in high-availability at the storage layer,” said Farid Yavari, Vice President of Technology at FalconStor. “That opens the door to using consumer-grade, commodity hardware that can fail without impact on service availability. On the other hand, the relatively smaller footprint of revenue-generating scale-up applications may justify paying a premium for name brand storage with HA and data protection features, because it’s unwise to test radical new technologies in that environment.”

Properly architected SDS platforms enable the use of heterogeneous commodity hardware to drive the lowest possible cost, orchestrate data services such as replication, and create policy-driven, heat-map based tiers, so data is placed on the appropriate storage media. An SDS approach eliminates the reliance on expensive, proprietary hardware and vendor “lock-in”.
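The policy-driven, heat-map-based tiering described above can be sketched in a few lines: track how often each block is accessed in a window, then map hot blocks to fast media and cold blocks to capacity media. This is a minimal illustrative model, not FreeStor's actual engine; the tier names and thresholds are hypothetical.

```python
from collections import Counter

# Hypothetical tiers, hottest first (illustrative names, not a vendor API)
TIERS = ["nvme-flash", "sas-ssd", "capacity-hdd"]

class HeatMapTierer:
    """Toy policy engine: place each block on a tier based on access heat."""

    def __init__(self, hot_threshold=100, warm_threshold=10):
        self.heat = Counter()   # block_id -> access count in the current window
        self.hot = hot_threshold
        self.warm = warm_threshold

    def record_access(self, block_id):
        self.heat[block_id] += 1

    def tier_for(self, block_id):
        count = self.heat[block_id]
        if count >= self.hot:
            return TIERS[0]     # hot data lands on flash
        if count >= self.warm:
            return TIERS[1]
        return TIERS[2]         # cold data sinks to cheap capacity media

t = HeatMapTierer()
for _ in range(150):
    t.record_access("blk-A")    # heavily accessed block
for _ in range(25):
    t.record_access("blk-B")    # moderately accessed block

print(t.tier_for("blk-A"))      # nvme-flash
print(t.tier_for("blk-B"))      # sas-ssd
print(t.tier_for("blk-C"))      # capacity-hdd (never accessed)
```

A production SDS layer would add decay of the heat counters over time and asynchronous data movement, but the placement decision itself reduces to exactly this kind of threshold policy.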

The two most common models for hyper-scale, scale-out storage are direct-attached storage (DAS) and a disaggregated model based on protocols such as Internet Small Computer Systems Interface (iSCSI) or Non-Volatile Memory Express (NVMe). Some very large custom data center installations, at companies with the right protocol-level engineering staff, run homegrown, workload-specific protocols developed to optimize storage traffic for their custom use cases. Because the DAS model is constrained by the available slots in a server, its scale is limited and often quickly outgrown, and compute and storage resources cannot be scaled independently. Enterprises therefore start with, or must ultimately move to, disaggregated storage models built with commodity hardware.

SDS adds intelligent orchestration and management to the disaggregated data center via an abstraction layer that separates heterogeneous storage hardware from applications, resulting in a more resilient, efficient, and cost-effective infrastructure. Because SDS is hardware agnostic, enterprises can introduce new storage technologies into a brownfield implementation, eliminating the need to deploy greenfield infrastructure when first migrating to newer storage models. Using SDS capabilities, the migration from legacy to modern technologies can happen over time, maximizing return on investment (ROI) in an established storage infrastructure. SDS provides flexibility in data migration, seamless tech-refresh cycles, and independent scaling of storage and server resources. Even where data protection and high availability (HA) capabilities aren't necessary, SDS can provide other valuable features such as actionable predictive analytics, Wide Area Network (WAN) optimization, application-aware snapshots, clones, Quality of Service (QoS), deduplication, and data compression.

“Software-defined storage solutions blend well with hyper-scale infrastructures built to meet growing requirements for storage flexibility, density and performance,” said Yavari. “Falling prices of flash, the introduction of various flavors of storage-class memory, and an increasing appetite for commoditization of the data center infrastructure has helped fuel possibilities in hyper-scale storage. SDS enables deployment of storage technologies with different capabilities, at various cost points, to drive the lowest possible Total Cost of Ownership (TCO).”

FalconStor's FreeStor® delivers enterprise-class, software-defined, intelligent data services combined with predictive analytics across any primary or secondary storage hardware, in the cloud or on-premises. FreeStor helps IT organizations realize more economic value from existing environments and any future storage investments while maximizing flexibility and operational efficiency.

About FalconStor
FalconStor® Software, Inc. (NASDAQ: FALC) is a leading software-defined storage company offering a converged data services software platform that is hardware agnostic. Our open, integrated flagship solution, FreeStor®, reduces vendor lock-in and gives enterprises the freedom to choose the applications and hardware components that make the best sense for their business. We empower organizations to modernize their data center with the right performance, in the right location, all while protecting existing investments. FalconStor’s mission is to maximize data availability and system uptime to ensure nonstop business productivity while simplifying data management to reduce operational costs. Our award-winning solutions are available and supported worldwide by OEMs as well as leading service providers, system integrators, resellers and FalconStor. The company is headquartered in Melville, N.Y. with offices throughout Europe and the Asia Pacific region. For more information, visit www.falconstor.com or call 1-866-NOW-FALC (866-669-3252).

Follow us on Twitter – Watch us on YouTube – Connect with us on LinkedIn

# # # 

FalconStor, FalconStor Software, FreeStor, and Intelligent Abstraction are trademarks or registered trademarks of FalconStor Software, Inc., in the U.S. and other countries. All other company and product names contained herein may be trademarks of their respective holders.

Partnership With ScienceLogic Yields Comprehensive Monitoring and Troubleshooting Across Physical, Virtual, Private-Cloud and Public-Cloud Environments

MILPITAS, Calif. – As enterprises embrace the flexibility, scalability and cost effectiveness of the cloud, they also must confront the operational reality that networks will include legacy on-premises, private-cloud and public-cloud segments. By some estimates, as many as four-fifths of enterprises today have already implemented such hybrid IT approaches, and are challenged to uniformly monitor and troubleshoot that mission-critical infrastructure. To address those needs, Viavi Solutions (NASDAQ: VIAV) today introduced a comprehensive solution for monitoring and packet capture across hybrid networks: Observer SightOps.

SightOps is an addition to Viavi's proven Observer platform, and was developed through the partnership with ScienceLogic announced in November 2015. According to ScienceLogic's annual report on hybrid IT trends, 81 percent of enterprises surveyed had already embraced hybrid IT, yet 62 percent were "flying blind, with little visibility" into that infrastructure. Monitoring solutions exist separately for physical, virtual, private-cloud and public-cloud network segments, making it difficult and time-consuming to create a uniform view, and impeding real-time responses to problems and threats. This lack of visibility raises the chances of a severe network outage -- disastrous both in the cost of downtime, which can run to millions of dollars per hour, and in the dedicated resources required to identify the root cause.

SightOps provides complete infrastructure monitoring that extends from existing legacy infrastructure -- such as physical routers, switches, load balancers and firewalls -- through virtualized and private-cloud environments, to the public cloud, including Amazon Web Services, Microsoft Azure, IBM SoftLayer, and VMware vCloud Air.

SightOps is also integrated with the Observer GigaStor retrospective network analysis (RNA) system, combining hybrid IT monitoring and packet capture for faster, complete troubleshooting by automating network anomaly investigations, and providing direct access to packets during ad-hoc investigations.

Additional key features of SightOps include:

  • automated discovery, visibility and dependency mapping across hybrid IT
  • the only system offering real-time discovery, mapping and visualization of the storage, network and compute components of public-cloud vendors AWS and Microsoft Azure
  • true multi-tenancy delivering secure, partitioned views to multiple stakeholders
  • runbook automation to streamline and automate key IT processes
  • a foundation for Software-Defined Networking (SDN) through support of Cisco Application Centric Infrastructure (ACI)

"Hybrid IT is not an interim state -- it has already become the norm in the enterprise," said Tom Fawcett, General Manager, Enterprise & Cloud, Viavi Solutions. "By delivering the industry's first integrated hybrid infrastructure monitoring and packet capture solution, Observer SightOps not only dramatically improves management of current installations, it also enables organizations to accelerate their adoption of virtual and cloud elements."

About Viavi Solutions
Viavi (NASDAQ: VIAV) software and hardware platforms and instruments deliver end-to-end visibility across physical, virtual and hybrid networks. Precise intelligence and actionable insight from across the network ecosystem optimizes the service experience for increased customer loyalty, greater profitability and quicker transitions to next-generation technologies. Viavi is also a leader in anti-counterfeiting solutions for currency authentication and high-value optical components and instruments for diverse government and commercial applications. Learn more at www.viavisolutions.com and follow us on Viavi Perspectives, LinkedIn, Twitter, YouTube and Facebook.


While These Firms Are Incredibly Sophisticated About Data in Their Core Businesses, Ironically They Use Crude and Outdated Sources to Gauge the Carbon Footprint of Their Data Centers, Finds Lux Research

BOSTON, Mass. – Each year, the data centers that power social media, streaming video, cloud computing, and connected devices use more than 90 billion kilowatt-hours of electricity -- enough to power New York City twice over -- and their consumption is still growing rapidly. The companies that run them, such as Google, Amazon, Facebook, and Apple, have the most advanced data analytics tools at their disposal, as well as high-minded public commitments to sustainability.

However, they are reliant on obsolete data tools for calculating emissions due to the electricity they purchase from the power grid. A new analytical tool from Lux Research finds that data centers frequently use far more coal, and thus have much greater emissions, than previously thought.

Today, operators rely on the U.S. Environmental Protection Agency (EPA) Emissions & Generation Resource Integrated Database (eGRID) to estimate their emissions. However, eGRID divides the U.S. electricity grid into just 24 broad regions, and is updated only infrequently -- the most recent information available is from 2012.

"Our team of data scientists analyzed the North American electric grid, improving the accuracy of carbon reporting by a factor of 80. The results show that many sites are far more reliant on coal than reported -- notably, they include many large data centers," said Ory Zik, Lux Research Vice President of Analytics and the team leader of Lux's energy benchmarking.

"For example, we found that Google underestimates its dependence on coal in four out of seven data centers, in particular at its Berkeley County, S.C. location," he added.

The new Lux Grid Network Analysis (GNA) divides the grid into 134 regions, instead of just 24, providing more granular insight, and makes use of U.S. Energy Information Administration (EIA) data that is updated monthly, as opposed to three-year-old annual data. Applying the Lux GNA to U.S.-based data centers shows where operators are coming up short in their sustainability reporting:

  • Google misses the mark in four of its seven data centers. Google uses eGRID to estimate its electricity emissions, but four of its seven major U.S. data centers rely more on coal than the eGRID data implies. As a result, Google's emissions are likely some 42,000 MT CO2e per year higher than its estimates -- the equivalent of 8,500 additional SUVs on the road.
  • Amazon's estimates are off in over 20 centers. Amazon is less transparent about how it calculates its emissions, but its 23 Virginia-based cloud services data centers draw about 43% of their electricity from coal -- not the 35% inferred using eGRID. The difference amounts to 85,000 MT CO2e more per year -- some 5,000 households' worth of emissions.
  • The changing grid drives the need for better tools. Investments in renewable energy are growing exponentially, while natural gas is displacing coal, changing the composition of grid generation. The right prioritization of what to do and where, begins with better analytics. With the tools now available, it's time for data center owners to bring to their energy decisions the same data-driven rigor they use in the rest of their businesses.
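The arithmetic behind corrections like those above is straightforward: the understatement is the annual consumption multiplied by the gap between the assumed and actual coal shares, scaled by coal's emissions intensity. A minimal sketch, using the 35% and 43% coal shares from the Amazon example; the consumption figure and the coal emissions factor below are placeholders, not Lux's actual inputs:

```python
# Illustrative recalculation of the kind of correction Lux GNA produces.
# Only the 35% (eGRID) and 43% (Lux GNA) coal shares come from the release;
# the consumption and emissions-factor figures are hypothetical placeholders.
annual_kwh = 1.0e9                # assumed fleet consumption, kWh/year
coal_kg_co2e_per_kwh = 1.0        # rough CO2e intensity of coal generation

egrid_coal_share = 0.35           # coal share inferred from eGRID
gna_coal_share = 0.43             # coal share per the finer-grained analysis

# Extra coal-fired kWh not captured by the coarse eGRID regions
understated_kwh = annual_kwh * (gna_coal_share - egrid_coal_share)

# Convert kg CO2e to metric tons
understated_mt_co2e = understated_kwh * coal_kg_co2e_per_kwh / 1000.0

print(f"Coal-driven emissions understated by ~{understated_mt_co2e:,.0f} MT CO2e/year")
```

With these placeholder inputs the gap is about 80,000 MT CO2e per year; the real figures depend on actual consumption and the generation mix backing out the remaining 57% of supply.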

Lux's energy benchmarking further defines the analysis of data center coal usage and is part of the Lux Research Analytics service.

Resource Information for System Knowledge (RISK)

Resource scarcity, price volatility, and increased social and environmental pressures all threaten supply disruption and unplanned price spikes -- yet for most companies, the impact of resource dependence is increasingly hard to evaluate. Lux Research's Resource Information for System Knowledge (RISK) platform guides companies through this treacherous landscape.

RISK provides the necessary support to help companies benchmark their global operations from the perspectives of profitability, resilience, and sustainability. Companies using the platform can determine where substitution strategies can be deployed, and foresee where breakthrough innovation can have the most impact.

RISK and Data Centers

Organizations make decisions that involve complex value chains (in the case of data centers, largely the electricity grid) and complex resource interactions (for data centers, the interaction of distributed generation and the grid). Companies need to think about resources outside the four walls of their facilities to mitigate resource cost, risk, and environmental impact. For data centers, only about 12% of carbon emissions arise within the four walls of the facility; the rest of the losses (and emissions) come from resources procured elsewhere. Download the complimentary White Paper, "How Dirty is Your Cloud?"

About Lux Research

Lux Research provides strategic advice and ongoing intelligence for emerging technologies. Leaders in business, finance and government rely on us to help them make informed strategic decisions. Through our unique research approach focused on primary research and our extensive global network, we deliver insight, connections and competitive advantage to our clients. Visit www.luxresearchinc.com for more information.

New Geo Location and Key Fragmentation Features Boost Data Protection by Making It Harder to Snoop

WASHINGTON, DC – Covata Limited (ASX: CVT), a global leader in data-centric security solutions, announced today that its new 'Key-as-a-Service' (KaaS) offering will initially be deployed with technology giant Cisco. The service eliminates the burden of encryption-key and access-policy management, and guards against emerging legal and regulatory issues, through patent-pending techniques such as key fragmentation and GeoLoc. With the amount of data set to increase exponentially as the Internet of Things (IoT) takes off, securing the keys and achieving data sovereignty will be vital to prevent hackers from using these devices as a vehicle to compromise corporate and personal networks. Cisco and Covata are already evaluating a range of IoT and cloud projects that will determine the first Covata KaaS customer to embed the service into its products.

Covata KaaS provides a scalable and tailored framework for protecting content wherever it resides, supporting policy driven access and allowing for full access remediation. Its patent pending Geo Location process shifts the focus from trying to control where data is, to controlling the location of the encryption keys. KaaS eliminates the complex, hitherto unresolved issue of protecting billions of pieces of unstructured data flowing around the Internet. KaaS provides an open standard to allow this protection to happen in the background with very little overhead. If access is requested within the data sovereignty jurisdiction, then the policy permits the release of the key. The data also never passes through the key server, as it is encrypted on the device and only the authorised intended recipient can decrypt it. This also ensures a full audit trail for compliance.

In practice, a company may use the geo-policy to block any keys being issued for data that is not physically within a specific country's borders. For example, if an employee is travelling abroad and their phone or data becomes compromised on local networks, the hacker will not have access to the data, because the key will not be issued while the device is in that country -- Covata GeoLoc.
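The geo-fenced key release described above reduces to a simple gate on the key server: check the requesting device's jurisdiction against policy before handing over the key. A minimal sketch, with hypothetical names; this is not Covata's actual API, and a real deployment would authenticate the location claim rather than trust the client:

```python
# Toy geo-policy check in the spirit of Covata GeoLoc (all names hypothetical).
ALLOWED_JURISDICTIONS = {"GB", "DE"}   # policy: release keys only inside these borders

def release_key(key_store, key_id, device_country):
    """Release a decryption key only if the device is inside the geo-fence."""
    if device_country not in ALLOWED_JURISDICTIONS:
        return None                    # travelling device: key withheld, data stays dark
    return key_store.get(key_id)

keys = {"doc-123": b"\x00" * 32}       # placeholder 256-bit key

print(release_key(keys, "doc-123", "GB") is not None)  # True: inside the fence
print(release_key(keys, "doc-123", "US"))              # None: key not issued abroad
```

Because the data itself never passes through the key server, withholding the key is sufficient: the ciphertext on the compromised device remains unreadable.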

Covata KaaS also offers key fragmentation. The master key is fragmented and delivered to four different data custodians: Covata KaaS itself; the government (or duly authorized agent of the government) of the jurisdiction where the generating key service resides; the tenancy owner or its nominated escrow agent; and the auditing firm of the Covata KaaS. This ensures that only authorized parties can access a whole key, by obtaining a duly presented court order, or similar, from the legal jurisdiction of the home key service. In effect, the reassembly process ensures that any government can gain access to data only through a rigorous and open process -- not via snooping.
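The release does not specify the fragmentation scheme, but the property it claims -- no custodian alone learns anything, and all four must cooperate to reassemble -- is exactly what a simple XOR-based 4-of-4 split provides. A sketch under that assumption (Covata's patent-pending method may differ):

```python
import secrets

def split_key(master: bytes, n: int = 4) -> list:
    """XOR-split a key into n fragments; all n are required to reassemble.

    n-1 fragments are uniformly random, so any subset short of all n
    reveals nothing about the master key (information-theoretically).
    """
    frags = [secrets.token_bytes(len(master)) for _ in range(n - 1)]
    last = master
    for f in frags:
        last = bytes(a ^ b for a, b in zip(last, f))
    return frags + [last]

def reassemble(frags: list) -> bytes:
    """XOR all fragments back together to recover the master key."""
    out = bytes(len(frags[0]))
    for f in frags:
        out = bytes(a ^ b for a, b in zip(out, f))
    return out

master = secrets.token_bytes(32)            # hypothetical 256-bit master key
fragments = split_key(master)               # one fragment per custodian

assert reassemble(fragments) == master      # all four together recover the key
assert reassemble(fragments[:3]) != master  # any three alone yield only noise
```

Schemes like Shamir's secret sharing generalize this to k-of-n thresholds, where any k custodians suffice; the all-or-nothing XOR split shown here matches the "every custodian must be served" model the release describes.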

"The Covata KaaS standard security protocol is the only way to maintain the integrity of the open Internet," said Trent Telford, CEO of Covata. "The Internet was designed to share information, it wasn't designed to be secure, which is why continually bolstering perimeter defences is like trying to beat a wild horse into submission. We're offering a modern approach to security that enables everyone who enjoys the creativity and openness of the Internet to continue to do business securely with the same ease and flexibility they expect. Data sovereignty is achieved through key sovereignty -- a genuine security revolution and one we think will solve this major 21st century hurdle."

"The Internet of Things is going to blow the number of machines connected to the Internet out of the water, and hugely increase the number of transactions we see occur online. A vast amount of this unprotected data is continually being sent to cloud or big data services where it is aggregated and used for decision-making. With much of this data being either moderately or extremely sensitive, it is creating a potential goldmine for hackers. The ramifications of this data being compromised could range from terrorist attacks shutting down vital operations such as power or transportation systems, through to stealing market sensitive data, which is why IoT needs a new approach to security. Cisco is not just talking the talk in moving to cloud and IoT, but walking the walk like no others," concluded Telford.

About Covata

Covata enables true ownership and control over your data in the cloud and over mobile services. We deliver data-centric security solutions without compromising simple usability, providing true end-to-end security. Your data is always protected wherever it may travel -- inside your network, beyond the domain, to mobile devices and to the cloud -- with granular access controls that extend to external users, view-only restrictions, real-time revocation and complete visibility and auditability. Own Your Data, control your data and choose where it is stored -- with complete assurance that it is protected and secure. For further information, please visit Covata.com.

Acquisition Will Bring Together Connectivity, Security, Automation and Real-Time Insights for a Complete IoT Service Solution

SAN JOSE, Calif. – Cisco (NASDAQ: CSCO) announced today its intent to acquire Jasper Technologies, Inc., a privately held company based in Santa Clara that delivers a cloud-based IoT service platform to help enterprises and service providers launch, manage and monetize IoT services on a global scale. Under the terms of the agreement, Cisco will pay $1.4 billion in cash and assumed equity awards, plus additional retention based incentives.

Jasper is the industry's leading IoT service platform in terms of number of enterprises and service providers; in fact, many of the world's largest enterprises and service providers are using the Jasper platform to scale their IoT services business globally. With Jasper, companies can connect any device -- from cars to jet engines to implanted pacemakers -- over the cellular networks of the top global service providers, and then manage connectivity of IoT services through Jasper's Software as a Service (SaaS) platform.

IoT brings with it many complexities, such as connecting and securing millions of devices and collecting and analyzing data. Jasper simplifies these challenges and helps customers accelerate the shift to IoT. The Jasper IoT service platform automates the management of IoT services across connected devices and enables companies to create new business models that transform their products into connected services and generate new sources of ongoing revenue.

Jasper develops and provides a SaaS platform -- a predictable, recurring-revenue IoT business -- that manages and drives a wide range of connected devices and services for more than 3,500 enterprises worldwide, working with 27 service provider groups globally.

The proposed acquisition will allow Cisco to offer a complete IoT solution that is interoperable across devices and works with IoT service providers, application developers and an ecosystem of partners. Cisco will continue to build upon the Jasper IoT service platform and add new IoT services such as enterprise Wi-Fi, security for connected devices, and advanced analytics to better manage device usage.

"I am excited about the opportunity for Cisco and Jasper to accelerate how customers recognize the value of the Internet of Things," said Chuck Robbins, Cisco Chief Executive Officer. "Together, we can enable service providers, enterprises and the broader ecosystem to connect, automate, manage, and analyze billions of connected things, across any network, creating new revenue streams and opportunities."

"IoT has become a business imperative across the globe. Enterprises in every industry need integrated solutions that give them complete visibility and control over their connected services, while also being simple to implement, manage and scale," said Jahangir Mohammed, Jasper Chief Executive Officer. "By coming together, Jasper and Cisco will help mobile operators and enterprises accelerate their IoT success." 

Jasper CEO Jahangir Mohammed will run the new IoT Software Business Unit under Rowan Trollope, Cisco senior vice president and general manager, IoT and Collaboration Technology Group. The acquisition is expected to close in the third quarter of fiscal year 2016, subject to customary closing conditions.

Visit the blog for more information about Cisco's intent to acquire Jasper.

Investor and Media Conference Call:

Cisco Vice President of Corporate Development Rob Salvagno will join Cisco's Senior Vice President and General Manager, IoT and Collaboration Technology Group, Rowan Trollope, and Jasper CEO Jahangir Mohammed, to host a joint investor and press call on February 3 at 2:00 PM PST to discuss the proposed transaction. To view the webcast go to: http://edge.media-server.com/m/p/3jnsipgu. The dial-in number is +1 773-756-4602 (international) or 888-810-6801 (United States), passcode: 6383252. A conference call replay will be available from 4:00 PM PST on Wednesday, February 3rd. The replay will also be available via webcast on the Cisco Investor Relations website at http://investor.cisco.com.

About Cisco
Cisco is the worldwide leader in IT that helps companies seize the opportunities of tomorrow by proving that amazing things can happen when you connect the previously unconnected. For ongoing news, please go to http://thenetwork.cisco.com.

Forward-Looking Statements

This press release may be deemed to contain forward-looking statements, which are subject to the safe harbor provisions of the Private Securities Litigation Reform Act of 1995, including the expected completion of the acquisition and the time frame in which this will occur, the expected benefits to Cisco and its customers from completing the acquisition, and plans regarding Jasper personnel. Readers are cautioned that these forward-looking statements are only predictions and may differ materially from actual future events or results due to a variety of factors, including, among other things, the potential impact on the business of Jasper due to the uncertainty about the acquisition, the retention of employees of Jasper and the ability of Cisco to successfully integrate Jasper and to achieve expected benefits, business and economic conditions and growth trends in the networking industry, customer markets and various geographic regions, global economic conditions and uncertainties in the geopolitical environment and other risk factors set forth in Cisco's most recent reports on Form 10-K and Form 10-Q. Any forward-looking statements in this release are based on limited information currently available to Cisco, which is subject to change, and Cisco will not necessarily update the information.

Cisco and the Cisco logo are trademarks or registered trademarks of Cisco and/or its affiliates in the U.S. and other countries. A listing of Cisco's trademarks can be found at www.cisco.com/go/trademarks. Third-party trademarks mentioned are the property of their respective owners. The use of the word partner does not imply a partnership relationship between Cisco and any other company.

RSS Feed for Cisco: http://newsroom.cisco.com/rss-feeds

BOCA RATON, Fla. – Cleartronic, Inc. (OTC PINK: CLRI) through its subsidiary ReadyOp Communications, Inc. announces that the Tampa Police Department used ReadyOp™ for the planning and operational communications during Tampa's annual Gasparilla Parade this past Saturday. The Gasparilla Parade is the third largest parade in the United States following only the Macy's Thanksgiving Day Parade and the Rose Bowl Parade. More than 200,000 people watched more than 380 floats in this year's event.

During the parade, the Tampa PD sent more than 17,000 SMS messages to more than 1,000 officers in the crowd, including personnel from other federal, state and local law enforcement, fire and emergency medical services, as part of its operational control procedures.

"This was the fourth year the Tampa Police have used ReadyOp™, and their use increases every year. Large events like this demonstrate the power of ReadyOp™ and how it enables fast, efficient and simple communications to individuals, teams and groups," said Marc Moore, CEO of ReadyOp™ Communications.

Planning and communications are vital to the success of any organization. ReadyOp provides an easy, efficient, yet powerful capability for organizations, government agencies, universities and other groups to conduct daily operations, plan for special events and respond to incidents that may occur.

About Cleartronic, Inc.

Cleartronic, Inc. (OTC PINK: CLRI) is a technology holding company that creates and acquires operating subsidiaries to develop, manufacture and sell products, services and integrated systems to government agencies and business enterprises. ReadyOp™ is a secure, web-based platform providing organizations with a single site for planning, response, communications and documentation of personnel, tasks, assets and activities. Cleartronic currently has two operating subsidiaries, ReadyOp Communications, Inc. and VoiceInterop, Inc. -- www.cleartronic.com.

For further information about this release, contact Rich Kaiser, Marketing Consultant, YES INTERNATIONAL, 800-631-8127, yes@yesinternational.com, and www.cleartronic.com.

Safe Harbor Statement: 

This press release may include predictions, estimates, opinions or statements that might be considered "forward-looking" under the provisions of the Private Securities Litigation Reform Act of 1995. Such statements generally can be identified by phrases such as the Company or its management "believes," "expects," "anticipates," "foresees," "forecasts," "estimates," or other, similar words or phrases.

Metaswitch's Innovative Technology and Go-to-Market Services Transition Provider From Reseller to Operator in Less Than 150 Days

LONDON – Metaswitch Networks®, the pioneering network software provider, today announced that Chicago-based managed service provider Access One has deployed Metaswitch's Business Communications solution to improve and extend its voice infrastructure and support services that include hosted PBX and Unified Communications (UC).

"We moved from reselling services from a wholesale cloud voice provider to deploying our own infrastructure, based on Metaswitch's solution set, in less than 150 days," said Rick Wagner, VP of engineering for Access One. "The economic benefits of building versus reselling are very favorable. With Metaswitch's solution in place, we're better placed to compete and provide a more complete customer experience."

Access One has been providing telephony, data and managed IT services to mid-sized businesses for more than 20 years. Since its inception, Access One's high-touch, personalized service has set it apart from its competitors and helped it maintain a loyal and growing customer base.

"We enjoy working with innovative providers like Access One to provide the technology, training and support programs they need to help them make their customers successful," said Chris Carabello, senior director of product marketing for Metaswitch. "In addition to providing exceptional technology and customer support, Metaswitch takes the extra step of helping our customers create and sell great services so they can grow more quickly, satisfy their customers and bring in more revenue." 

To learn more, read our new case study, "Access One Builds Better Business Communications with Metaswitch."

About Metaswitch Networks
Metaswitch is powering the transition of communication networks into a cloud-based, software-centric, all-IP future. As the world's leading network software provider, we design, develop, deliver and support commercial and open source software solutions for network operators. Our high performance software runs on commercial, off-the-shelf hardware, as appliances or in the cloud. We package this software into solutions that are redefining consumer and business communications and enabling the interconnection between diverse network services and technologies. We also apply our software development expertise to removing network virtualization complexities in the data center, with a solution that easily scales and secures workload interconnection in support of mission-critical IT and real-time communication applications. For more information, please visit: http://www.metaswitch.com.

Copyright © 2016 Metaswitch Networks. "Metaswitch" and "Metaswitch Networks" are registered trademarks. Brands and products referenced herein are the trademarks or registered trademarks of their respective holders.

NEW YORK – SmartMetric, Inc. (OTCQB: SMME) – A just-released consumer survey conducted by the leading law firm Morrison & Foerster has found that 52% of consumers named identity theft as their biggest concern about privacy. The research also showed that 1 in 3 people change where they make purchases based on concerns over the security of their information.

Of the 52% of people who say they are concerned about privacy and data theft, 82% stated they have changed where they shop because of these concerns.

"This consumer research data is highly impactful for both online and main street retailers, showing an effect on their businesses that in most cases has not been considered or taken into account when weighing the impact of data breaches on future business," said SmartMetric's President and CEO, Chaya Hendrick. "What we see from this survey is that consumer behavior is dramatically affected by data breaches, and that anxiety about consumer data safety remains a significant top-of-mind issue for the public. This anxiety and concern is why we have created a safer payments card using the power of biometrics."

SmartMetric has achieved an astounding reduction in electronic component size, allowing it to shrink a fingerprint reader so that it fits inside a credit or debit card. The futuristic biometric-secured card also has a slim rechargeable battery, developed by SmartMetric, allowing the card holder to have their fingerprint scanned by the card before it is inserted into a retail card reader or ATM.

About SmartMetric: SmartMetric has created a safer and better user validation and identification technology for payment and identity cards, using a person's individual and unique biometrics to validate and identify the card user. The company has created a super-miniature, fully functional fingerprint scanner that fits inside a payments card as well as identity and secure log-on cards. Using an internal Cortex processor built into the card, the SmartMetric biometric card scans, reads and matches a person's fingerprint in less than 0.25 seconds. The card's internal scanner is powered by a rechargeable battery, also developed by SmartMetric and embedded inside the payments and identity card. SmartMetric is a publicly traded, fully reporting company on the United States OTCQB.

To view a video of the SmartMetric biometric chip card, follow this link:

SmartMetric Biometric Payments Card -- https://youtu.be/zSX59uHoHqU

To view the company website: www.smartmetric.com

Safe Harbor Statement: Certain of the above statements contained in this press release are forward-looking statements that involve a number of risks and uncertainties. Such forward-looking statements are within the meaning of that term in Section 27A of the Securities Act of 1933 and Section 21E of the Securities Exchange Act of 1934. Readers are cautioned that any such forward-looking statements are not guarantees of future performance and involve risks and uncertainties, and that actual results may differ materially from those indicated in the forward-looking statements as a result of various factors.

 

New Turnkey HSP Appliance Delivers Native Integration With Pentaho for Robust Data Integration and Analytics; Simplifies Deployment, Operations and Scaling of Enterprise Big Data Projects

SANTA CLARA, Calif. – Hitachi Data Systems Corporation (HDS), a wholly owned subsidiary of Hitachi, Ltd. (TSE: 6501), today unveiled the next generation Hitachi Hyper Scale-Out Platform (HSP), which now offers native integration with the Pentaho Enterprise Platform to deliver a sophisticated, software-defined, hyper-converged platform for big data deployments. Combining compute, storage and virtualization capabilities, the HSP 400 series delivers seamless infrastructure to support big data blending, embedded business analytics and simplified data management. 

Modern enterprises increasingly need to derive value from massive volumes of data being generated by information technology (IT), operational technology (OT), the Internet of Things (IoT) and machine-generated data in their environments. HSP offers a software-defined architecture to centralize and support easy storing and processing of these large datasets with high availability, simplified management and a pay-as-you-grow model. Delivered as a fully configured, turnkey appliance, HSP takes hours instead of months to install and support production workloads, and simplifies creation of an elastic data lake that helps customers easily integrate disparate datasets and run advanced analytic workloads.

HSP's scale-out architecture provides simplified, scalable and enterprise-ready infrastructure for big data. The architecture also includes a centralized, easy-to-use user interface to automate the deployment and management of virtualized environments for leading open source big data frameworks, including Apache Hadoop, Apache Spark, and commercial open source stacks like the Hortonworks Data Platform (HDP).

"Many enterprises don't possess the internal expertise to perform big data analytics at scale with complex data sources in production environments. Most want to avoid the pitfalls of experimentation with still-nascent technologies, seeking a clear path to deriving real value from their data without the risk and complexity," said Nik Rouda, Senior Analyst at Enterprise Strategy Group (ESG). "Enterprise customers stand to benefit from turnkey systems like the Hitachi Hyper Scale-Out Platform, which address primary adoption barriers to big data deployments by delivering faster time to insight and value, accelerating the path to digital transformation."

The next-generation HSP system now offers native integration with Pentaho Enterprise Platform to give customers complete control of the analytic data pipeline and enterprise-grade features such as big data lineage, lifecycle management and enhanced information security. The powerful combination of technologies in the next-generation HSP appliance was designed to accelerate time to business insight and deliver rapid return on investment (ROI), while simplifying the integration of information technology (IT) and operational technology (OT) -- a strategic imperative for modern, data-driven enterprises.

"Modern enterprises must merge their IT and OT environments to extend the value of their investments. HSP is a perfect solution to accelerate and simplify IT/OT integration and reduce the time to insight and business value of their big data deployments," said James Dixon, chief technology officer at Pentaho. "The HSP-Pentaho appliance gives customers an affordable, enterprise-class option to unify all their disparate datasets and workloads -- including legacy applications and data warehouses -- via a modern, scalable and hyper-converged platform that eliminates complexity. We're pleased to be working with HDS to deliver a simplified, all-in-the-box solution that combines compute, analytics and data management functions in a plug-and-play, future-ready architecture. The Hitachi Hyper Scale-Out Platform 400 is a great first step in simplifying the entire analytic process."

With HSP, Hitachi continues to deliver on the promise of the software-defined datacenter to simplify the delivery of IT services through greater abstraction of infrastructure, and improved data access and automation. While its initial focus is on big data analytics use cases, the company's long-term direction for HSP is to deliver best-in-class total cost of ownership (TCO) for a variety of IT workloads. Hitachi will offer HSP in two configurations to support a broad range of enterprise applications and performance requirements: Serial Attached SCSI (SAS) disk drives, generally available now, and all-flash, expected to ship in mid-2016.

"We consistently hear from our enterprise customers that data silos and complexity are major pain points -- and this only gets worse in their scale-out and big data deployments. We have solved these problems for our customers for years, but we are now applying that expertise in a new architecture with Hitachi Hyper Scale-Out Platform," said Sean Moser, senior vice president, global portfolio and product management at Hitachi Data Systems. "Our HSP appliance gives them a cloud and IoT-ready infrastructure for big data deployments, and a pay-as-you-go model that scales with business growth. Seamless integration with the Pentaho Platform will help them put their IT and OT data to work -- faster. This is only the first of many synergistic solutions you can expect to see from Hitachi and Pentaho. Together, we are making it easy for our enterprise customers to maximize the value of their IT and OT investments and accelerate their path to digital transformation."

Learn More. Join the Conversation.

About Hitachi Data Systems

Hitachi Data Systems, a wholly owned subsidiary of Hitachi, Ltd., builds information management and Social Innovation solutions that help businesses succeed and societies be safer, healthier and smarter. We focus on big data that offers real value -- what we call the Internet of Things that matter. Our IT infrastructure, analytics, content and cloud solutions and services drive strategic management and analysis of the world's data. Only Hitachi Data Systems integrates the best information technology and operational technology from across the Hitachi family of companies to deliver the exceptional insight that business and society need to transform and thrive. Visit us at www.HDS.com.

About Pentaho, a Hitachi Group Company

Pentaho, a Hitachi Group company, is a leading data integration and business analytics company with an enterprise-class, open source-based platform for diverse big data deployments. Pentaho's unified data integration and analytics platform is comprehensive, completely embeddable and delivers governed data to power any analytics in any environment. Pentaho's mission is to help organizations across multiple industries harness value from all their data, including big data and IoT, enabling them to find new revenue streams, operate more efficiently, deliver outstanding service and minimize risk. Pentaho has over 15,000 product deployments and 1,500 commercial customers today, including ABN-AMRO Clearing, EMC, Landmark Halliburton, Moody's, NASDAQ, RichRelevance, and Staples. For more information, visit www.pentaho.com.

About Hitachi, Ltd.

Hitachi, Ltd. (TSE: 6501), headquartered in Tokyo, Japan, delivers innovations that answer society's challenges with our talented team and proven experience in global markets. The company's consolidated revenues for fiscal 2014 (ended March 31, 2015) totaled 9,761 billion yen ($81.3 billion). Hitachi is focusing more than ever on the Social Innovation Business, which includes power & infrastructure systems, information & telecommunication systems, construction machinery, high functional materials & components, automotive systems, healthcare and others. For more information, please visit the company's website at http://www.hitachi.com.

HITACHI is a trademark or registered trademark of Hitachi, Ltd. All other trademarks, service marks and company names are properties of their respective owners.