Edd Dumbill may have just won the argument over whether data lakes are a practical, achievable idea.
Data lakes are a simple enough idea: You dump a wide range of data into a Hadoop cluster and then leverage it across the enterprise.
The problem is what Gartner calls the "Data Lake Fallacy": the challenge of managing data lakes in a governable and secure way.
Dumbill acknowledges the barriers to data lake adoption in a recent O’Reilly Radar Podcast. Ultimately, though, the VP of strategy at Silicon Valley Data Science says data lakes will happen for one reason: Data lakes free data from enterprise silos.
“One of the hardest things for organizations to get their head around is getting data in the first place,” Dumbill told O’Reilly’s Mac Slocum. “A lot of CIOs will be, ‘Great, I want to do data science but I’ve got this database over here and this one over here and these all need to speak to each other and they’re in different formats and so on.’ In many ways, having data in a data lake provides you with a foundation with which you can start to integrate data and then make it accessible as a building block in an organization.”
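Dumbill's point can be sketched in a few lines. The example below is purely illustrative: the silo names, fields and in-memory strings are hypothetical, and plain Python stands in for Hadoop tooling. The idea it shows is the one he describes: land raw data from different silos in one place first, and defer schema and integration decisions to read time.

```python
import csv
import io
import json

# Hypothetical silos: one exports CSV, the other JSON. In a real data
# lake these would be raw files landed in HDFS or object storage.
crm_csv = "customer_id,name\n1,Acme\n2,Globex\n"
billing_json = '[{"customer_id": 1, "balance": 250.0}, {"customer_id": 2, "balance": 80.5}]'

def load_csv(text):
    """Normalize a CSV export into plain dict records."""
    return [dict(row) for row in csv.DictReader(io.StringIO(text))]

def load_json(text):
    return json.loads(text)

# Land both silos in one "lake": a single list of records tagged by
# source, with no up-front schema (schema-on-read).
lake = [{"source": "crm", **r} for r in load_csv(crm_csv)] + \
       [{"source": "billing", **r} for r in load_json(billing_json)]

# Integration becomes a join over the shared key, done at read time.
by_id = {}
for rec in lake:
    by_id.setdefault(str(rec["customer_id"]), {}).update(rec)

print(len(by_id))  # prints 2: each customer merged from both silos
```

Because both formats end up as the same record structure, adding a third silo later means writing one more loader, not renegotiating schemas between every pair of systems.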
The topic of money – who pays for what and how to get the best plans when business and consumer activities are mixed – has been a vexing one since Bring Your Own Device (BYOD) emerged. It has taken something of a back seat while companies figured out how to keep data secure and separate in the two spheres.
Those primary tasks are well on their way to being solved. Now, attention is turning, as it eventually always does, back to the money. The industry is getting serious about the issue, at least at the rudimentary level of splitting work and consumer bills.
Mobile Enterprise reports that the AT&T Work Platform will enable organizations to separate work and consumer expenditures. The story says that it is an important task from several points of view. Of course, there is the simple point of figuring out who pays for what. Beyond that are the legal, human resources and tax regulations. AT&T is cooperating with big-name vendors MobileIron, AirWatch by VMware and Good Technology on the platform.
The overall storage market has recently seen a number of challenges in achieving desired goals, such as in the number of petabytes vendors actually sell. That has led a few prognosticators to apply a “sky-is-falling” analysis to the situation, as that attracts attention. But that approach is fundamentally wrong.
Now, in any dynamic and rapidly changing market such as storage, where trends such as software-defined solutions and flash technologies are transforming vendor and customer expectations, and where global IT trends like cloud, big data and mobile also have an immense impact, there are likely to be challenges. That is especially the case where both established vendors and newer players duke it out.
The key is not to panic. And that is why it is so important to IBM’s storage customers that the company is staying the course. This does not mean standing still, but rather progressing in a measured manner. IBM’s recent fourth-quarter storage announcements do not contain any blockbusters. For that we can be grateful: blockbusters absorb all the attention, and we have to expend a lot of thought, time and energy trying to understand what impact they will have.
There are many 'rules' that govern what we do as business continuity professionals – some are sector specific, some are based on geography. But which of them apply to your organization? When you start to look into it, it's not difficult to become confused as to which you are supposed to abide by.
The Business Continuity Institute now aims to simplify this by publishing what we believe to be the most comprehensive list of legislation, regulations, standards and guidelines in the field of business continuity management. This list was put together based upon information provided by the members of the Institute from all across the world. Some of the items may only be indirectly related to BCM, and should not be interpreted as specifically designed for the industry, but rather they contain sections that could be useful to a BCM practitioner.
The ‘BCM Legislations, Regulations, Standards and Good Practice’ document breaks the list down by country and for each entry provides a brief summary of what the regulation entails, which industries it applies to, what the legal status of it is, who has authority for it and, finally, it provides a link to the full document itself.
The BCI has done its best to check the validity of these details but takes no responsibility for their accuracy and currency at any particular time or in any particular circumstances. To download a copy of the document, click here.
(TNS) — After a series of 13 small earthquakes rattled North Texas from Jan. 1 to Wednesday, a team of scientists is adding 22 seismographs to the Irving area in an effort to learn more.
The team of seismologists from Southern Methodist University, which has studied other quakes in the area since 2008, deployed 15 of the earthquake monitors Wednesday. SMU studies of quakes in the DFW Airport and Cleburne areas have concluded that wastewater injection wells, used by the natural gas industry to dispose of fracking byproducts, are a plausible cause of the temblors in those areas.
But Craig Pearson, seismologist for the state Railroad Commission, said that is not the case with the Irving quakes.
“There are no oil and gas disposal wells in Dallas County,” said Railroad Commission of Texas seismologist Dr. Craig Pearson in a Wednesday email.
The recent Ebola outbreak unearthed an interesting phenomenon. A “mystery hemorrhagic fever” was identified by HealthMap — software that mines government websites, social networks and local news reports to map potential disease outbreaks — a full nine days before the World Health Organization declared the Ebola epidemic. This raised the question: What potential do the vast amounts of data shared through social media hold in identifying outbreaks and controlling disease?
Ming-Hsiang Tsou, a professor at San Diego State University and an author of a recent study titled The Complex Relationship of Realspace Events and Messages in Cyberspace: Case Study of Influenza and Pertussis Using Tweets, believes algorithms that map social media posts and mobile phone data hold enormous potential for helping researchers track epidemics.
“Traditional methods of collecting patient data, reporting to health officials and compiling reports are costly and time consuming,” Tsou said. “In recent years, syndromic surveillance tools have expanded and researchers are able to exploit the vast amount of data available in real time on the Internet at minimal cost.”
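The kind of syndromic surveillance Tsou describes can be illustrated with a toy sketch. Everything below is hypothetical: the tweets, dates and keywords are invented, and the spike rule (mentions exceeding double the running average) is a crude stand-in for the statistical baselines that real surveillance tools such as HealthMap use.

```python
# Hypothetical daily tweet samples; a real system would stream posts
# from an API and geocode them before counting.
daily_tweets = {
    "2015-01-01": ["feeling great", "flu shot today", "coffee time"],
    "2015-01-02": ["down with the flu", "flu symptoms all week", "fever and chills"],
    "2015-01-03": ["flu everywhere", "bad flu season", "fever again", "sick with flu"],
}

KEYWORDS = {"flu", "fever"}

def symptom_mentions(tweets):
    """Count tweets containing at least one surveillance keyword."""
    return sum(1 for t in tweets if KEYWORDS & set(t.lower().split()))

counts = {day: symptom_mentions(msgs) for day, msgs in sorted(daily_tweets.items())}

# Flag days whose mention count exceeds double the running average.
spikes = []
history = []
for day, n in counts.items():
    baseline = sum(history) / len(history) if history else 0.0
    if history and n > 2 * baseline:
        spikes.append(day)
    history.append(n)

print(spikes)  # prints ['2015-01-02']: the jump from 1 to 3 mentions
```

Even this crude version shows why the approach is cheap and fast: the raw signal is already public and arrives in real time, so detection is a counting problem rather than a reporting chain through clinics and health officials.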
Despite tremendously increased attention, the number of reported cyberbreach incidents escalated rapidly in 2014. According to Information Commissioner’s Office data collected by Egress Software Technologies, U.K. businesses saw substantially more breaches last year, with industry-wide increases of 101% in healthcare, 200% in insurance, 44% among financial advisers, 200% among lenders, 56% in education and 143% in general business. As a result, these industries also saw notable increases in fines for data protection violations.
The role of employees was equally alarming. “Only 7% of breaches for the period occurred as a result of technical failings,” Egress reported. “The remaining 93% were down to human error, poor processes and systems in place, and lack of care when handling data.”
Check out more of the findings from Egress’ review in the infographic below:
The future of IT infrastructure is changing. My friend, BJ Farmer over at CITOC, is fond of reminding me that Change is the Only Constant (see what CITOC stands for?).
It’s true for almost everything in life, and especially true for our industry. You can either embrace the changes that come along, evolving how you present services to your clients, or you can slowly lose relevance and fade out of the big picture. The choice is yours.
Right now, change comes from The Cloud.
Yes, there is definitely a lot of hype about the cloud, and it’s easy to grumble about fads and look at the big cloud migration as a bandwagon everyone’s too eager to jump on. But the plain fact is that the cloud is providing affordable, smart alternatives to the kind of infrastructure that used to be the bread and butter of an MSP, and it’s not going anywhere. So you can either keep railing against the cloud, running your Exchange servers and piecing together various services from different partners, or you can start thinking about how to offer innovative solutions for your clients by STRATEGICALLY leveraging the cloud.
At the PLI Advanced Compliance & Ethics Workshop in NYC in October, Scott Killingsworth of the Bryan Cave law firm noted that each risk assessment should be unique. I agree, and I believe that the case for uniqueness is even more powerful for the combined program and risk assessments companies sometimes undertake. Given the diversity of possibilities, where should you start in scoping out such an engagement? Another way of asking this question is “How should you conduct a needs assessment for a program/risk assessment?”
To begin, it may be worth thinking in terms of the following six fields of information that can constitute the subjects of an assessment:
Continuity Central’s annual survey asking business continuity professionals about their expectations for the year ahead is now live.
Please take part at https://www.surveymonkey.com/r/businesscontinuityin2015
The survey looks at the trends and changes the profession can expect to see in the year ahead.
Read the results from previous years: