In my frequent engagement with business, regulatory and political leaders on the topic of big data, I am often asked: "Who is getting Big Data right, and what are they doing differently to get positive results?"
I'm asked this question often enough to get the sense that few organizations are seeing positive results from their big data efforts, despite the fact that they are investing millions of dollars, spending thousands of hours and betting their businesses' futures on the success of these analytic efforts.
While I've tried to articulate how I have seen some organizations get big data "right," it's frequently more compelling to explain how others seem to be getting it terribly "wrong." In this vein, I offer the following six signs that an organization will likely fail at big data, and a bit of guidance on how not to join them.
1. Management Wants to See an ROI Before You Start
Much of the recent hype surrounding big data focuses on disruption: the idea that data analytics can lead to insights that allow businesses to completely transform themselves. The definition of disrupt in Webster's dictionary reads:
to cause (something) to be unable to continue in the normal way : to interrupt the normal progress or activity of (something)
Businesses have been collecting, analyzing and learning from data for decades, which implies that something must be different this time around. Somehow the processes, tools, and techniques used in data analytics for the last half century have played out, and we need to achieve something better; something disruptive.
If disruption leads to results that aren't normal or expected, how in the world can you possibly make a prediction of its business value? How can you possibly figure out how much disruption a given change is going to make, if both the disruption and its effects are DISRUPTIVE? Isn't unpredictability a synonym for disruption?
If you follow this line of thinking, you'll see that the standard practice of developing a return on investment (ROI) analysis for big data is a most asinine exercise. How in the world is it possible to predict the unpredictable with any degree of confidence, and how can such a request be defended as rational?
Indeed, if you make a prediction as to how much disruption your disruptive efforts will achieve and you're proven correct in your prediction, have you actually disrupted anything at all? If I meet my expectations, have I actually made any real change, or have I merely achieved a new, higher degree of normal; of predictable progress without any actual disruption?
When I meet executives who are developing their big data strategy and they ask me what ROI to expect, my usual answer is, "Yes, there will be an ROI, as long as you do it right." This frequently confuses them, as they've been trained over several decades to expect some falsely precise answer to the question, "What's the ROI of this project?" I explain to them that hitting any such estimate is, by definition, a failure to disrupt. If they don't get what I'm saying within a couple of minutes, I'm usually pretty certain that whatever they subsequently invest in will fail to generate much, if any, disruption.
2. Your Data Warehouse Team is Leading the Effort
If ever there was a fox in the henhouse in technology, it is when data warehouse experts are asked to do big data. If I'm going to get disruptive results I need to use different tools and techniques to ask different questions of different data. That's a whole lot of "different" to ask of people who have likely profited greatly from being "normal" for one or more decades.
Your business intelligence, data warehouse or analytics team has been doing the same old thing with the same old tools and the same old data for a very long time, and they're likely very good at it. They are experts, with lots of expensive training, certifications, knowledge and experience in doing things the same old way. Asking them to change every aspect of what they do and how they do it is an awful lot to ask, and it can be pretty scary. Indeed, I tell people all the time that if they are doing big data and they aren't scared, they aren't doing it right.
If you assign your big data efforts to your existing analytics or warehousing team, you'll likely find one of two outcomes: either your team of existing experts willingly throws away their decades of experience, expertise, knowledge and preconceptions, adopts a completely new world view, and changes everything that they do and how they do it, or they pretend to do all of these things while not changing one iota. Which sounds more likely to you?
3. Marketing is Leading the Effort
Many business people recognize the challenge faced by IT people in adopting an entirely new way of thinking and doing, as described above. In response to this issue, it's fairly common that the business, and more particularly marketing, circumvents their own technology people entirely and deploys what is commonly referred to as "rogue IT."
An entire industry is cropping up to facilitate this trend, and you're likely seeing more and more vendors providing "Analytics as a Service," "Data as a Service," or, optimistically, "Results as a Service." This all sounds wonderfully simple to marketing executives who are completely uninterested in knowing what is going on inside of the little white box labeled "Data Lake" in some architecture diagram. However, those with a degree of technical prowess understand that delivering on the promise of big data is remarkably complex, architecture diagrams notwithstanding.
Many companies that I work with are driving their big data efforts through their marketing departments. They are handing their budget to outside vendors who promise much yet are frequently challenged to deliver upon these promises. Many of these executives find that even when they do gain valuable insights about their customers, the rest of their organization is completely unprepared to act upon these insights. Even if insights are achieved, they're impossible to monetize. Is it any surprise that big data is entering a period of disillusionment?
4. Your First Step Was to Pick a Technology
If there is one activity most Information Technology departments have perfected over the last 50 years, it’s the process of tool selection. From investigation to down selection; from creating Requests for Information (RFIs) and Requests for Proposal (RFPs) to selecting a winning bid, IT has become a well-oiled tool-picking machine.
This is very evident in the world of big data, where picking a platform, tool, vendor and architecture seems to be consuming vast amounts of time and attention in the offices of CIOs the world over. Indeed, many CIOs seem to believe that once they've selected a particular package in which their data will be analyzed, their work is done.
This approach frequently leads to a grim reality. Once IT starts down the path of tool-specin' and vendor-pickin', the die is likely cast and they'll achieve the same old results they always have. In order to achieve disruption, you must disrupt yourself. Hence, if your IT department approaches this problem the same way they've approached every problem they've ever faced before, what's the likelihood they'll achieve a disruptive result? What seems the most reasonable answer here?
5. Your Second Step Was to Hire a "LinkedIn Data Scientist"
Some of the companies I work with have recognized that it is extremely difficult for their internal experts to think disruptively. In recognition of this, they look to outsiders to help them on their journey to big data disruption. The challenge here is that big data has experienced so much hype by now that it's becoming nearly impossible to figure out who actually knows anything about this stuff.
Enter the search terms "big data" or "data scientist" on LinkedIn and you will likely get tens of thousands of results. People with any range of education and work experience are now claiming to be data scientists, because the demand for expertise has rapidly outstripped supply. What makes this problem worse is the fact that many organizations don't know enough about "data science" to accurately assess whether or not their data science expert actually is one.
6. You Won't Have Results for Another 4, 5 or 6 Months
Finally, I talk with a number of organizations who, according to their plan and their ROI analysis, expect to see business results from their big data effort four, five or six months in the future. This is a recipe for disaster, and it stems directly from making one or more of the prior five mistakes.
To succeed at big data you need to first do it wrong. You need to fail, learn, adapt and re-try in rapid succession. Rapid iteration is the key to success here, and anyone creating a plan for a strategy for an approach to an RFP for a technology pilot is likely setting themselves up for an epic failure. To win at this you must remain nimble, gain insights quickly, fail fast and small, and then respond to those failures even faster. This is how you prime the pump of disruption and ensure that you reach and maintain the degree of change that Big Data is forcing upon organizations. If your current plan calls for several months of effort, after which some sort of results are predicted to be available, be afraid.
Christopher Surdak is the author of the book "Data Crush: How the Information Tidal Wave Is Creating New Business Opportunities", winner of GetAbstract's International Book of the Year, 2014; and Winner of the WhartonDC Club's Benjamin Franklin Innovator of the Year, 2015.