For years, storage has almost been an afterthought, the thing that needs to be set up after all the cool stuff – ecommerce, cloud, analytics, what have you – has been put in place.

Storage was always a necessary evil – disks and tapes (the latter still in use by many enterprises) are hardware investments that must be made every year, chasing ever-growing volumes of data. IT managers attempted to control these costs through strategies such as data compression, which bought more time. Storage area networking promised to distribute storage requirements across a “pool” of devices, but was often expensive to set up and administer. Still, year after year, storage devices and arrays needed to be purchased and installed. Even today, the typical response to big data challenges is to throw more disk at the problem.

A survey of 600 IT executives by Spiceworks finds that about eight percent of IT budgets is dedicated to external storage outside of servers (SAN and network-attached storage).

Lately, it seems storage options have suddenly multiplied, offering a vast array of solutions for every need. Cloud is the biggest story when it comes to storage, with many of the largest infrastructure providers – such as Amazon Web Services, Microsoft and Rackspace – offering highly scalable storage sites at pennies per megabyte. Other players, such as Box and Dropbox, offer storage options that double as productivity and collaboration tools.

For on-premises needs, the latest technologies – including flash, solid-state drives, and in-memory computing – offer new ways to provide rapid access to data. Still another approach is the memory-first architecture, in which databases are first loaded into server memory, then written out to disk.

But storage is much, much more than hardware (and that includes cloud), just as computing is much, much more than servers. What is also needed are intelligent storage strategies that reduce the burden of maintaining and delivering data. Automating routine storage tasks is one such approach. Data lifecycle management is another: older datasets are offloaded to cheaper but lower-performing media (tape is the classic example, though even hard disk drives are downright slow compared to SSDs and flash), while newer data is kept on the newer, faster – and more expensive – storage media.
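To make the lifecycle idea concrete, here is a minimal sketch of an age-based tiering policy in Python. The tier names and age cutoffs are purely illustrative assumptions, not an industry standard – real lifecycle tools weigh access frequency, compliance rules, and cost per gigabyte as well.

```python
from datetime import datetime, timedelta

# Hypothetical tiers and age cutoffs, ordered hot to cold.
# These thresholds are illustrative assumptions only.
TIERS = [
    ("flash", timedelta(days=30)),    # hot: newest data on the fastest media
    ("hdd",   timedelta(days=365)),   # warm: older data on spinning disk
    ("tape",  timedelta.max),         # cold: archival data on the cheapest media
]

def choose_tier(last_accessed: datetime, now: datetime) -> str:
    """Pick a storage tier based on how long ago the data was last accessed."""
    age = now - last_accessed
    for tier, cutoff in TIERS:
        if age <= cutoff:
            return tier
    return TIERS[-1][0]  # fall back to the coldest tier
```

In practice a scheduled job would run a policy like this over a dataset catalog and migrate anything whose current tier no longer matches.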

The bottom line is that as more organizations rely on big data and analytics to drive their businesses (which is especially the case in the insurance industry), they will also need to step up their storage strategies.

The world seems to be waking from its storage slumber, with an impressive array of options now available to keep data safe and secure. Again, however, storage is not a brute hardware issue. The best storage strategies require intelligence.

This blog entry has been reprinted with permission.


The opinions posted in this blog do not necessarily reflect those of Insurance Networking News or SourceMedia.
