Last month I talked about cybercrime as big business and how crime rings take advantage of point of sale (PoS) technology to collect and sell the data they gather. I’d like to build on that conversation, using a new study from Hewlett Packard Enterprise (HPE) that takes an in-depth look at the underlying economy driving cybercrime.
I had the opportunity to talk to HPE researchers involved with this report, and they told me that cybercriminals operate their business in much the same way that any other small-business owner does. They seek out people who are skilled in different areas – not just computer programmers, but also, say, those with good financial chops or a talent for marketing. They recruit and vet potential employees. The biggest differences between their business operation and yours are that theirs is involved in illegal activities and that it is all done anonymously. That's right – these folks operate under online aliases, so you will probably never know anyone's true identity. It's a business model based primarily on trust and reputation within the Dark Web.
Why should you care about these cybercriminal business ventures? They are your competitors, according to Kerry Matre, senior manager of Security Portfolio Marketing at HPE. Maybe they aren't going head-to-head with you in a specific industry, but they are looking at how you use technology and the type of data you collect in the course of everyday business, and they are devising ways to target attacks against that data.
This spring will mark the fifth anniversary of the devastating tornado that struck Joplin, Mo., on Sunday, May 22, 2011. The tornado killed 161 people and caused nearly $3 billion in damage. Keith Stammer was the Joplin/Jasper County director of Emergency Management then and still holds the post today. He talked recently about the recovery and the lessons learned in Joplin.
This year marks the fifth anniversary of the 2011 tornado. How has the recovery gone?
Recovery is going pretty well; everything is cleaned up. We got that done in short order. The problem here is coming back with housing. Joplin has more rentals than it has homeownership, so we have a lot of low- and moderate-income people who need places to stay. If you've ever rebuilt housing, particularly with state and federal tax credits, it takes a while.
We were warned that this would take some time, but I was hoping it wouldn't take as long as they thought. That being said, we've gained back the small share of the population we lost. We actually have a few more residents than we had prior to the tornado, and unemployment is running under 5 percent. The other big thing that helped Joplin was that we basically live off sales tax and not off property tax, and sales tax revenue did not go down. In fact, it went up because everyone wanted to rebuild. So that helped us from a financial standpoint in terms of not losing anything.
The Internet of Things (IoT) is gaining momentum across industries as organizations strive to compete using data. Gartner estimates that by 2020, 25 billion connected "things" will be in use. Whether it's weather monitors out in the field or wearables, companies are getting insights that were previously not possible and achieving new levels of automation. The question is whether the devices are enterprise-ready.
"Enterprises adopting IoT devices have to support enterprise standards with authentication, encryption, and protocols," said Andy Beier, director of engineering at BI software vendor Domo, in an interview. "The greatest barrier to IoT data flow is that these devices are not created with an enterprise standard, making it more difficult for companies to benefit."
Even when IoT devices are built for enterprise use, there's no guarantee they'll work together. In smart commercial buildings, for example, different manufacturers are working to get their devices to communicate via APIs or an orchestration platform, but the process isn't yet plug-and-play or any-to-any simple.
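The "enterprise standards" Beier mentions – authentication in particular – often come down to a device being able to prove who it is to the platform ingesting its data. Here is a minimal sketch of one common pattern, a device signing its telemetry with a per-device shared secret (HMAC-SHA256) so the receiving platform can verify the sender. The device ID, secret, and payload fields are all hypothetical; this is an illustration of the general technique, not any specific vendor's protocol.

```python
import hashlib
import hmac
import json

# Hypothetical device credentials, provisioned out of band.
DEVICE_ID = "sensor-042"
DEVICE_SECRET = b"example-shared-secret"

def sign_telemetry(payload: dict, secret: bytes) -> dict:
    """Wrap a telemetry payload with a device ID and an HMAC-SHA256 signature."""
    # Canonicalize the payload so device and platform hash identical bytes.
    body = json.dumps(payload, sort_keys=True).encode("utf-8")
    signature = hmac.new(secret, body, hashlib.sha256).hexdigest()
    return {"device_id": DEVICE_ID, "payload": payload, "signature": signature}

def verify_telemetry(message: dict, secret: bytes) -> bool:
    """Recompute the HMAC on the platform side and compare in constant time."""
    body = json.dumps(message["payload"], sort_keys=True).encode("utf-8")
    expected = hmac.new(secret, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(message["signature"], expected)

message = sign_telemetry({"temp_c": 21.5, "ts": 1700000000}, DEVICE_SECRET)
```

A consumer device that ships without any such mechanism – or with a hard-coded secret shared across all units – is exactly the kind of gap that makes it "not created with an enterprise standard."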
Global companies have been embracing socially responsible spending projects to build stronger relationships with local communities. The idea makes a lot of sense and real projects can result in real benefits.
As with any significant source of money, there are risks. Major global companies have been caught in some embarrassing situations, some of which can have real legal and reputational consequences.
Think of the irony of these situations – in an attempt to promote goodwill in emerging markets, companies spend large amounts of money, only to discover later that foreign leaders have pocketed the funds at the expense of the intended local beneficiaries.
The value proposition of the public cloud is pretty clear. Indeed, there are few companies today that aren’t taking advantage of it in some way. The benefits of a private cloud can be a bit more challenging to define.
Jim Rapoza, editorial director and senior analyst at the Aberdeen Group, has seen the innovative ways in which many companies have effectively implemented a private cloud. Here, he shares some of its use cases, and recommends what companies should focus on when building one.
According to Rapoza, one of the main reasons to implement a private cloud is to gain better management over your virtualized infrastructure and be able to better provide services to end users and the business.
The enterprise has seen many a storage war over the decades, or perhaps it’s more accurate to say many battles of a single storage war. The latest of these pitted the rival cloud providers in a contest to see who could deliver more capacity at the lowest cost.
But even as this phase is winding down, a new one is emerging for the heart and soul of Big Data and IoT data preservation. And the field of battle is no longer on the drive level but in memory subsystems, which are proving to be a lot more versatile than their traditional roles as high-speed cache and random access devices would suggest.
The big breakthrough came earlier this week when IBM announced major improvements to its phase-change memory (PCM) technology that boost performance well past Flash on a number of key parameters while maintaining relative price parity. According to a paper presented at the IEEE International Memory Workshop in Paris, the company says it can now reliably store three bits per cell in a standard 64k-cell array that has been pre-cycled more than a million times and maintained at temperatures as high as 167°F. This provides a write endurance a thousand times better than Flash's while maintaining random-access and write-in-place capabilities that Flash does not have. The company plans to implement the technology as a cluster-level and data center solution, pairing it with low-latency networking for data-intensive applications. (Disclosure: I provide web content services for IBM.)
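To put the reported figures in perspective, a bit of back-of-the-envelope arithmetic helps: the raw numbers (three bits per cell, a 64k-cell array, 167°F) come from the announcement, while the derived values below are my own calculations, not IBM's.

```python
# Capacity of the demonstrated array: 64k cells at 3 bits per cell.
cells = 64 * 1024
bits_per_cell = 3
capacity_bits = cells * bits_per_cell
capacity_kib = capacity_bits / 8 / 1024   # 24.0 KiB for the demo array

# The reported 167 degrees F operating temperature, in Celsius.
temp_c = (167 - 32) * 5 / 9               # 75.0 degrees C
```

The absolute capacity of a single demo array is tiny, of course; what matters is that the density-per-cell and endurance figures scale to the cluster-level deployments IBM is targeting.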
As mobility has enabled us to work anywhere, the spaces we occupy are now material to the productivity and outcomes we achieve. Quite simply, these spaces and their attributes have an effect on how we work.
Collaborative, activity-based work has become the new default workstyle. It not only embraces the concepts of increased consumerization and mobility, but also the human need to work closely with others.
There is a growing delta, however, between the experiences we achieve when we collaborate remotely using tools like GoToMeeting or Skype for Business and the experiences we have when collaborating physically, in meeting or conference rooms.