A frequently quoted 2013 survey by big data consulting firm InfoChimps found that 55% of big data projects are never completed. More recently, Gartner forecast that, through 2017, 60% of big data projects would ultimately be abandoned. And in a 2015 survey led by PwC, 66% of companies said they gained little or no tangible benefit from their information.
Why are big data project failure rates so high? There are indeed key factors that will determine the success of your big data analytics - and I’ll get to those in a minute - but first, a question: Did all of the big data projects that were reported as failures really fail?
Big data writer and technologist Paige Roberts raises this important question in a recent blog article. Her colleagues working in big data professional services all agreed that they had never seen a project completely “fail.”
“Even if the crazy, unreasonable, or non-existent expectations for a project weren’t met,” one colleague told her, “the business always got some benefit.”
Paige continues:
I realized as he explained what he meant that his definition of failure was a lot like mine for archery… only a small percentage [of arrows] hit exactly what you’re aiming at, especially when you’re just learning. The ones that hit a few inches to the left or right, or even the ones that hit in an outer ring on the target aren’t considered failures, just less than perfect hits, and something to learn from to do better on the next shot…
It does make me wonder, how many of those “failed” projects are still being put to good use [somewhere] in those businesses that declared them a bust? How many big data projects fail to succeed by the original criteria set, but succeed in providing far more value to the business than their cost?
Paige Roberts’ point is well taken. It is a mistake to “grade” a big data project in a binary “success or failure” manner. It is far more useful to evaluate where your organization’s big data initiatives stand along a big data maturity model. One such model, recently presented by TDWI, stands out as particularly helpful; our white paper explores it in further detail.
In a nutshell, there are three key attributes your big data analytics should have to successfully move along TDWI’s big data maturity model:
- Actionable - providing a bridge between big data analytic calculations and specific, measurable business actions. Nuxeo can help expedite this by triggering case management queues in response to analytic model results and by providing key metadata to enrich big data sources (see the sketch after this list)
- Pervasive - resolving make-or-break, corporate-level concerns over big data management and governance by leveraging existing Nuxeo functionality that ensures enterprise information security, flexibility, and scalability
- Operational - integrating big data analytics freely into business processes company-wide, enabled by Nuxeo, while also providing a true digital workplace that drives collaborative decision-making
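To make the “Actionable” attribute a bit more concrete, here is a minimal sketch of how an analytics job might hand its results off to Nuxeo: write the model score back as document metadata, and start a review workflow when the score crosses a threshold so the case lands in someone’s work queue. The server URL, credentials, the analytics:churn_score property, the CaseReviewWorkflow model name, and the threshold are all illustrative assumptions, not Nuxeo’s prescribed integration pattern; adapt them to your own instance.

```python
# Sketch: enrich a Nuxeo document with an analytic score and, when the score
# crosses a threshold, start a workflow that routes the case to a review queue.
# NUXEO_URL, credentials, the analytics:* schema, and the workflow model name
# are illustrative assumptions.
import requests

NUXEO_URL = "https://nuxeo.example.com/nuxeo/api/v1"
AUTH = ("analytics_bot", "secret")   # service account (assumption)
CHURN_THRESHOLD = 0.8                # illustrative threshold

def push_score(doc_id: str, churn_score: float) -> None:
    """Write a model score onto a Nuxeo document, then queue it for review."""
    # 1. Enrich the document's metadata with the analytic result
    #    (hypothetical custom analytics schema).
    requests.put(
        f"{NUXEO_URL}/id/{doc_id}",
        auth=AUTH,
        json={
            "entity-type": "document",
            "properties": {"analytics:churn_score": churn_score},
        },
    ).raise_for_status()

    # 2. If the score is high enough, start a workflow so the case shows up
    #    in a reviewer's task queue (workflow model name is an assumption).
    if churn_score >= CHURN_THRESHOLD:
        requests.post(
            f"{NUXEO_URL}/id/{doc_id}/@workflow",
            auth=AUTH,
            json={
                "entity-type": "workflow",
                "workflowModelName": "CaseReviewWorkflow",
            },
        ).raise_for_status()

# Example: a nightly batch job scoring a customer-account document.
push_score("9f2c5a1e-0000-0000-0000-000000000000", 0.92)
```

The point of the sketch is simply that the analytic model’s output doesn’t stop at a dashboard: it lands as metadata on content and as a task in a queue, which is what makes it actionable.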
The further you move along the big data maturity model, the more successful your big data projects will be - and the more readily others in the organization will recognize them as such. The Nuxeo Platform is ready to help you get there.