Just How Big is Big Data? It's All Relative

Lately, the term du jour in IT management circles has been “Big Data,” used to describe the huge surge in information coming into enterprises from all directions: the Web, sensors, transactional applications, end-user files, and an abundance of other places.

But when we talk about “Big Data,” is there anything new under the sun? In many ways, Big Data is a relative term that changes over time. Think about our threshold of “expensive.” In 1980, a jaw-droppingly expensive car in a showroom may have had a price sticker of $28,000. Nowadays, no one blinks an eye at such a price.

Likewise, our idea of what counts as Big Data is subject to a form of inflation. I remember having a chat about a decade ago with Richard Winter of Winter Group, which tracked the world's largest databases. At that time, a handful of ginormous databases were cracking the 1 TB mark. Now, that's almost what one can run on a PC.

A few months back, I attended an industry panel, sponsored by Teradata, which featured a debate on the notion of Big Data. And it is indeed a very fluid term. Donald Feinberg, an analyst with Gartner, put it this way: 10 years ago, a company with gigabytes of data would have been considered to be handling Big Data. Today, “if you asked 200 people for a definition of 'Big Data,' you would get 200 definitions,” he said. “Everything is relative.”

However, another panelist said there may be a threshold as to what constitutes Big Data—a moving threshold, but a threshold nonetheless. Mark Jeffery of Northwestern's Kellogg School of Management said that systems begin to behave differently once that threshold is crossed.

“When you get into systems more than a couple hundred terabytes into the petabyte range, the rules are different,” he said. “You have to think about scalability, concurrency and humungous tables. And you need some serious horsepower to be able to do that.”
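To make that point a bit more concrete, here is a minimal Python sketch of one reason “the rules are different” at that scale: a table too large for any single machine gets hash-partitioned across many workers so loads and scans can run concurrently. The partition count, distribution key and function names here are illustrative assumptions, not any particular data warehouse's API.

```python
# Minimal sketch: hash-partition rows of a huge table across workers so each
# worker can load and scan its own slice concurrently. All names and the
# partition count are illustrative assumptions.
import hashlib

NUM_PARTITIONS = 64  # assumption: 64 worker nodes/partitions


def partition_for(customer_id: str) -> int:
    """Route a row to a partition by hashing its distribution key."""
    digest = hashlib.md5(customer_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_PARTITIONS


def route_rows(rows):
    """Group incoming rows by partition so each worker handles only its slice."""
    buckets = {p: [] for p in range(NUM_PARTITIONS)}
    for row in rows:
        buckets[partition_for(row["customer_id"])].append(row)
    return buckets


if __name__ == "__main__":
    sample = [{"customer_id": f"C{i:07d}", "premium": 100 + i % 50} for i in range(1000)]
    buckets = route_rows(sample)
    used = sum(1 for b in buckets.values() if b)
    print(f"rows spread across {used} of {NUM_PARTITIONS} partitions")
```

The design choice the sketch hints at is the one Jeffery describes: once data outgrows a single box, questions of how it is distributed, and how many workers can touch it at once, dominate everything else.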

The ability to store Big Data is evident; the real challenge is being able to filter and process such voluminous material and turn it into actionable insights.

Such capabilities open up possibilities for new types of applications built on capturing and analyzing Big Data, applications that could be very different from, and far more advanced than, what we have now.

For example, we're seeing carriers putting telematics to work, enabling data to be streamed in from customers' vehicles to help set incremental pricing. A few years ago, carriers' database infrastructures would have been incapable of managing and storing all the data associated with millions of customers' driving habits. Today, it's just another data set.
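As an illustration of what such an application might look like under the hood, here is a hedged Python sketch that turns a batch of streamed trip records into a simple premium adjustment. The trip schema, thresholds and pricing factors are hypothetical, chosen only to show the shape of the calculation, not any carrier's actual rating model.

```python
# Hypothetical usage-based pricing sketch built on streamed telematics records.
# Schema, thresholds and factors are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class TripRecord:
    vehicle_id: str
    miles: float
    hard_brakes: int        # count of hard-braking events in the trip
    night_miles: float      # miles driven late at night


def premium_adjustment(trips: list) -> float:
    """Return a multiplicative adjustment to the base premium for one vehicle."""
    total_miles = sum(t.miles for t in trips) or 1.0
    brake_rate = sum(t.hard_brakes for t in trips) / total_miles
    night_share = sum(t.night_miles for t in trips) / total_miles

    adjustment = 1.0
    adjustment += 0.10 if brake_rate > 0.05 else -0.05   # hypothetical thresholds
    adjustment += 0.05 if night_share > 0.25 else 0.0
    return round(adjustment, 2)


if __name__ == "__main__":
    trips = [
        TripRecord("VIN123", miles=12.4, hard_brakes=1, night_miles=0.0),
        TripRecord("VIN123", miles=30.1, hard_brakes=0, night_miles=8.5),
    ]
    print(f"premium factor: {premium_adjustment(trips)}")
```

Multiply that across millions of vehicles reporting many trips a day, and the storage and processing demands quickly reach the scale the panelists were describing.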

So, every era has its Big Data. But as our capacity to handle and make sense of this data grows, so do the interesting ways it can be put to work.

Joe McKendrick is an author, consultant, blogger and frequent INN contributor specializing in information technology.

He can be reached at joe@mckendrickresearch.com.

This blog was exclusively written for Insurance Networking News. It may not be reposted or reused without permission from Insurance Networking News.

The opinions of bloggers on www.insurancenetworking.com do not necessarily reflect those of Insurance Networking News.
