Industry-leading version control and collaboration platform serves as a “single source of truth” for all assets, on and off premises
WOKINGHAM, UK – Perforce Software today announced the availability of the Perforce versioning engine on VMware vCloud® Air™, providing high-performance repositories that serve as a “single source of truth” for all of a company’s global development assets. The combination of Perforce and VMware vCloud Air brings scalability, visibility and security to development environments, along with the flexibility of on- and off-premises production systems.
Perforce uses VMware vCloud Air to support its own global development and has seen first-hand the benefits the two companies now bring to shared customers. The performance of Perforce deployed on vCloud Air was identical to its on-premises deployment using similar resources. vCloud Air allowed Perforce to set up a secure environment with the same skill set used to manage enterprise environments. The combined result is a hybrid on-premises and public cloud environment that provides the security, visibility and compliance required when dealing with business-critical intellectual property.
“IT organizations need to speed up product release cycles to address critical business needs more quickly without jeopardizing security,” said Ajay Patel, vice president of Application Services, vCloud Air, VMware. “Perforce running on VMware vCloud Air allows globally dispersed developers working in hybrid deployments (on-premises or off-premises) to securely collaborate throughout the development lifecycle, reducing both errors and wasted development time.”
Perforce enables Continuous Delivery, a development methodology that helps businesses release better products more quickly by keeping software in a releasable state at all times. Key versioning requirements for the practice include having a single repository for all enterprise assets, scalability to support highly automated processes, support for collaboration across multi-functional teams, and visibility across all projects.
VMware vCloud Air is a secure, dedicated hybrid cloud service operated by VMware, built on the trusted foundation of VMware vSphere®. The service supports existing workloads and third-party applications, as well as new application development, giving IT a common platform for seamlessly extending its data center into the cloud.
“VMware vCloud® Air™ is an ideal platform for Perforce since it provides the same enterprise-grade security and performance both on-premises and at cloud locations across the globe,” said Christopher Seiwald, founder and CEO of Perforce. “With VMware vCloud Air we can spin up new resources to support the development lifecycle as demand increases and we get the same performance advantages we’ve had with our on-premises systems.”
In the European Union in the past year, a whole range of corporate risk and regulatory issues have been at the top of the agenda, but at the top of my list are data protection and information security.
In this report on risk issues for 2014, I will look at websites, privacy impact assessments, cloud computing and the EU Data Protection Regulation.
Focus on Websites in the EU
In the past five years or so, the European Commission and regulators that focus on consumer protection have carried out regular “sweeps” of websites in order to assess levels of compliance. This trend will continue, and businesses that sell or license content to consumers need to review their online terms and conditions as well as their compliance with other e-commerce rules such as the E-Privacy Directive, E-Commerce Regulations and Distance Selling Regulations.
For example, an EU-wide screening of 330 websites that sell digital content (such as books, music, films, videos and computer games) across the European Economic Area revealed some significant areas of non-compliance.
How many among you out there are sushi fans? Conversely, how many consider the idea of eating raw fish right up there with going into the dentist’s office for some long-overdue remedial work? One’s love or distaste for sushi was used as an interesting metaphor for leadership in this week’s Corner Office section of the New York Times (NYT) by Adam Bryant, in an article entitled “Eat Your Sushi, and Expand Your Horizon”, where he profiled Julie Myers Wood, the Chief Executive Officer (CEO) of Guidepost Solutions, a security, compliance and risk management firm. Wood said her sushi experience relates to advice she gives college students now: “One thing I always say is ‘eat the sushi.’ When I had just graduated from college, I went with my mom to Japan. We had a wonderful time, but I refused to eat the sushi. Later, when I moved to New York, I tried some sushi and loved it. The point is to be willing to try things that are unfamiliar.”
I thought about sushi and trying something different in the context of risk assessments recently. I think that most compliance practitioners understand the need for risk assessments. The FCPA Guidance could not have been clearer when it stated, “Assessment of risk is fundamental to developing a strong compliance program, and is another factor DOJ and SEC evaluate when assessing a company’s compliance program.” Yet many compliance practitioners have difficulty getting their collective arms around what is required for a risk assessment and then how precisely to use it. The FCPA Guidance makes clear there is no ‘one size fits all’ for just about anything in an effective compliance program.
One type of risk assessment can consist of a full-blown, worldwide exercise, where teams of lawyers and fiscal consultants travel around the globe, interviewing and auditing. However, if there is one thing I learned as a lawyer that also applies to the compliance field, it is that you are only limited by your imagination. So, following the FCPA Guidance’s ‘no one size fits all’ proscription, I would submit that the same is true for risk assessments.
Napa, Calif., residents were awakened at 3:20 a.m. on Sunday, Aug. 24, by a magnitude 6.0 earthquake that struck six miles southwest of the Northern California city, sending as many as 160 people to the hospital, causing widespread damage, including dozens of broken water mains, and triggering six major fires. One person was still in critical condition Sunday evening.
The fires destroyed several mobile homes, and firefighters struggled with water pressure issues since a significant amount of pressure was lost because of the cracked and broken water mains. Most of the damage occurred in downtown Napa where the buildings are older.
There was also significant damage to roads, but the California Highway Patrol and the California Department of Transportation found no damage to bridges. The Transportation Department also had dive teams checking local toll bridges but found no damage.
(MCT) — A predawn earthquake rattled Napa, Calif., early Sunday morning, critically injuring at least three people as the shaking ripped facades and shattered windows from historic downtown buildings, toppled chimneys and ignited gas fires at mobile home parks.
Countless residents fled into darkened streets as a result of the quake, measured at magnitude 6.0 by the United States Geological Survey. It was the largest to hit the San Francisco Bay area since the devastating 6.9-magnitude Loma Prieta earthquake in 1989, prompting Gov. Jerry Brown to declare a state of emergency.
The Queen of the Valley Medical Center in Napa reported 120 people seeking treatment soon after the quake. They included a small child who was airlifted to UC Davis Medical Center with critical injuries authorities attributed to a collapsed chimney.
The buildup to fall is in full swing. The next step is Labor Day parades and barbecues and, then, the school buses will begin to roll.
IT and telecommunications never had a real summer slowdown this year, though. Much was done, lots of news was made, and the pace hasn’t slackened during the latter half of August. Here is a look at some of the news and more interesting commentary.
"I always imagined a few people on the phones in a small office taking calls, not a big office with actual departments, and definitely not anyone thinking about business continuity and risks." Over the past year I have heard this line said to me in varying forms when I have explained that I give advice on corporate risk and business continuity in the non profit sector.
It is a common misconception. While the risks relevant to the financial services industry, for example, are easy to list, applying that thinking to the non-profit sector, along with an understanding of what matters there, is not as immediately obvious.
Some challenges and observations:
The range of academic backgrounds in non-profit organisations is expansive, and the primary challenge is making business continuity accessible and relatable to all.
Attitudes that this would take too long, that it is not required in the industry, and that delivering primary front-line services is more important. But has anyone thought about the supporting functions?
"This will never happen to us anyway." At first, it made me feel uneasy hearing this but this is the best challenge to promote business continuity in any industry. Using the "if we don’t comply, we will get fined" card almost shifts the desired affect from wanting to provide great assurance to an exhausting check box exercise. The appetite and denial factor is a tough barrier to get around.
Forgotten plans - in most cases contingency plans existed in people’s minds but just not on paper. I have heard various stories of incidents triggering instant panic before the swift realisation that "oh yes, we have a plan, we know what we need to do" kicked off a series of reactions to get things back to normal.
Planning vs. practising - countless months were spent planning and writing, but practising those BCPs was missing. In recent exercises, participants told me no one had ever tested their plans before, and they found doing so really useful. Actions that were thought to take five minutes took twenty. This started a chain of actions that plan owners needed to implement in order to become more resilient in an incident. A friend once said to me that businesses don’t fail because of a bad business continuity plan, but because of bad choices. That stuck with me.
So what does BC look like in these industries?
We live in a robust and dynamic society, and whilst a generic approach to starting a plan is valuable, plans can be adaptable. I quickly realised that I was getting too hung up on wanting to make each team’s plan look the same, when what really mattered was that the plan absolutely has to work for the people invoking it; if it is clear and coherent, that is sufficient.
Without a doubt, non-physical threats such as reputational risks, loss of funding from a major donor and employee scandals can have serious impacts on your operation, especially when the majority of funding comes from public generosity. If an incident occurred, what would the emergency funding protocol be? It is things like this that need the most consideration. Yes, every industry needs to consider the building, IT/data and staff, but what about the intangible factors that can essentially spell disaster?
Making those threats relatable is key, as is shifting the view that risk and business continuity relate only to IT and financial services - especially because people with widely varying academic backgrounds in these industries often sit under one roof.
What does this all mean?
All non-profits, for example charities, are run like businesses. Fact!
Non-profit or not, business continuity is on everyone’s mind; they just don’t know that this is what it is called. Yes, what constitutes a threat varies from industry to industry, but essentially, what matters most is the resilience each organisation has to overcome any incident it faces.
RISKercizing until next time
It’s hard to have a conversation in the enterprise these days without the topic veering toward Big Data. What is it? Where does it come from? And what are we supposed to do with it?
But despite the fact that none of these questions have clear answers yet, IT is still tasked with preparing to accommodate Big Data and then figuring out how to derive real value from it.
Part of the problem is the term “Big Data” itself. While large data volumes are a facet of Big Data, that’s not where the challenge lies. Rather, says IBM’s Doug Balog, it’s the need to accommodate the ‘variety, velocity and veracity’ that advanced analytics require that will give most managers fits. This will require not only bigger, more scalable infrastructure, but entirely new ways to collect, analyze and store data, which, from IBM’s perspective, will require advanced POWER8 architectures married to powerful third-party platforms from partners like Canonical and the various Linux distributions.
Hybrid and All-Flash Storage Appliance Leader Achieves Key Milestones as StorTrends 3500i is Deployed Broadly to Seamlessly Expand Capacity and Boost Enterprise Application Performance
NORCROSS, Ga. – StorTrends® today announced significant momentum as demand for its hybrid and all-flash storage arrays for virtual and physical environments escalates. The StorTrends 3500i is the only storage area network (SAN) device to combine solid state drive (SSD) caching and SSD tiering into a single storage appliance. Optimized to support VMware, Microsoft Hyper-V, Citrix and RHEV for enterprises of all sizes, the solution delivers dramatic performance and reliability for the most demanding applications, including high-performance databases, Virtual Desktop Infrastructure (VDI), On-Line Transaction Processing (OLTP), cloud storage and mixed-workload environments - at industry-leading price points.
According to International Data Corporation (IDC), the solid state drive market will grow from $3.3 billion in 2013 to $10.9 billion in 2018. That represents a 5-year compound annual growth rate (CAGR) of 26.9 percent. StorTrends is experiencing a surge of growth to address the pent-up demand for its family of flexible storage solutions.
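As a quick sanity check on the quoted growth rate, the standard CAGR formula can be applied to the forecast figures. This is a minimal illustrative sketch; the `cagr` helper is not part of any StorTrends or IDC tooling.

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Return the compound annual growth rate as a fraction."""
    return (end_value / start_value) ** (1 / years) - 1

# IDC forecast from the release: $3.3B in 2013 growing to $10.9B in 2018.
growth = cagr(3.3, 10.9, 5)
print(f"{growth:.1%}")  # roughly 27%, in line with the quoted 26.9 percent
```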
To support SSD growth and match the dynamics of business, the StorTrends 3500i enables both SSD cache and SSD tier upgrades. The scalable solution enables options to upgrade from 200GB SSD drives to 400GB, 800GB or 2000GB SSD drives without costly forklift upgrades. In addition, by running the StorTrends iDATA tool, customers can easily identify exactly how much flash is required to support their environment. The iDATA software runs unobtrusively and analyzes capacity utilization, IOPS usage, reads vs. writes for volumes, network bandwidth, performance, server statistics and more to classify the amount of "hot data" and "cold data" and the flash capacity required.
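The general idea behind this kind of hot/cold classification can be sketched as follows: rank data blocks by access frequency and size the flash tier to hold the blocks that account for most of the I/O. This is a hypothetical illustration only - the `flash_needed_gb` function, block sizes and coverage threshold are assumptions, not the proprietary iDATA analysis.

```python
def flash_needed_gb(block_hits: dict[int, int], block_size_gb: float,
                    io_coverage: float = 0.9) -> float:
    """Return GB of flash needed so the hottest blocks cover
    `io_coverage` of total accesses (a simple hot-data heuristic)."""
    total = sum(block_hits.values())
    covered = 0
    blocks = 0
    # Walk blocks from hottest to coldest until enough I/O is covered.
    for hits in sorted(block_hits.values(), reverse=True):
        if covered / total >= io_coverage:
            break
        covered += hits
        blocks += 1
    return blocks * block_size_gb

# Example: six 100 GB blocks where two blocks generate 90% of the I/O.
hits = {0: 500, 1: 400, 2: 50, 3: 30, 4: 10, 5: 10}
print(flash_needed_gb(hits, 100))  # 200.0 -> the two hot blocks fit in flash
```

A real sizing tool would work from measured IOPS over time rather than static hit counts, but the ranking-and-coverage idea is the same.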
StorageReview Enterprise Lab tested the StorTrends SSD hybrid storage array built for high performance and maximum capacity. During the benchmark testing, which was conducted against well-known Tier 1 storage vendors, the StorTrends 3500i demonstrated top-of-the-class performance.
According to the lab testing, the biggest selling point of the 3500i was its availability in both hybrid and full SSD configurations, as well as the inclusion of both SSD caching and tiering functionality in the same array. This feature boosts overall system performance and is unique in the hybrid storage market as most vendors provide only caching or only tiering. StorTrends offers a variety of configurations for the 3500i, with the typical accelerated configuration using four SSDs for tiering and two for caching. The array supports multiple HDD tiers as well, letting users opt for performance or capacity oriented configurations. With an expansion shelf customers can tune for both, taking advantage of capacity and performance HDDs with the rapid flash layer on top. For enterprises that need even more performance, the 3500i may be configured entirely with flash drives to deliver unparalleled performance.
Key StorTrends milestones include the following:
- Launched StorTrends 3500i hybrid and all flash storage area network (SAN) to combine SSD Caching and SSD Tiering into a single storage appliance.
- Taneja Group Lab validation recognized StorTrends 3500i array as one of the most comprehensive, versatile and cost effective solid state systems on the market. The hybrid and all flash storage appliance delivers solid state performance and scalability for enterprises.
- Unveiled StorTrends PROFIT Program, a comprehensive channel program that gives partners one of the highest protected margins in the industry for StorTrends' All-Flash, Hybrid and Spinning Disk storage solutions.
- Launched StorTrends iDATA assessment tool, a software solution designed to provide an accurate assessment of IT infrastructure performance, capacity and throughput requirements. The StorTrends iDATA tool can assess pain points in an environment before they become disruptive and provide the details needed to make informed storage decisions - while eliminating the need to over-provision costly storage resources.
- SSG-NOW Lab Review identified StorTrends iDATA tool as an "invaluable" tool for accurately assessing an IT infrastructure in order to find and eliminate pain points, avoid business disruption and make informed storage purchase decisions.
"From day one we designed a patented, hybrid and all flash storage array that eliminates performance barriers impacting enterprise applications," said Justin Bagby, Director of StorTrends. "As the SSD market continues to explode, so does the need for scalable, reliable, low cost solid state storage that can deliver the performance improvements that help organizations realize the true potential of their virtual and physical infrastructure investments."
Tweet this: @StorTrends achieves record milestones as SSD Market Grows
StorTrends® from American Megatrends (AMI) is Performance Storage with Proven Value. StorTrends SAN and NAS storage appliances are installed worldwide and trusted by companies and institutions in a wide range of industries including education, energy, finance, state & local government, healthcare, manufacturing, marketing, retail, R&D and many more. StorTrends meets the challenges and demands of today's business environments by offering key network storage functionality such as unified storage, simplified management, business continuity, disaster recovery, high efficiency and virtualization support. For further information, please visit: http://www.stortrends.com/.
Every organization should have an Emergency Action or Evacuation Plan. Even when it is not required (by the building owner, fire department or occupancy regulations), it is a ‘best practice’ for every organization to plan and practice evacuating all personnel from the workplace. Often, evacuation focuses on getting out quickly. Surely that’s the most critical objective. While simple in principle, there are some considerations that should not be overlooked:
Too Close for Safety: The standard ‘rule of thumb’ for assembly points is at least 200 feet from the evacuated building. This is intended to assure personnel will not be endangered if window glass or other debris falls. Keep in mind that taller buildings may have a wider potential debris pattern. Two hundred feet should be used as the minimum. Assuring employee safety should be the priority.
Obstruction: When emergency services (fire, police, ambulance) arrive, will they have sufficient room to do their job? Crowds of evacuated personnel shouldn’t impede their work. Emergency services may need room to park and to turn their vehicles around. Make sure assembly points are a reasonable distance from entrances and drive paths, and assure personnel won’t interfere.