Can Software-Defined Data Centers Solve the Industry's Legacy Problems?

Can the data center be abstracted from the hardware beneath it? Enterprises are certainly ready to try, with many looking at the possibilities offered by software-defined data centers (SDDCs).

“SDDC may become one of the most disruptive technologies to impact enterprise data centers, and it has the potential to rapidly change the market landscape, as well as change the way data centers are funded, designed, provisioned and managed,” said Frank Ohlhorst in a new report published for GigaOm Pro.

As Ohlhorst explains it in the report, SDDC consists of three virtualized — and highly automated — components: servers, networks and storage. While there is a lot of common ground with private cloud computing, he cautions that SDDC should stand on its own as an initiative to abstract software and applications from hardware, versus pitching it as what he calls the more “nebulous” cloud concept.

Here's how SDDC addresses these three major layers of the data center:

Servers and compute: Server virtualization is already a well-established technology that many companies have been implementing in an effort to consolidate workloads and reduce the footprint of their data centers.

Networks: Ohlhorst credits the OpenFlow movement — which introduces “a standards-based protocol for network virtualization that can be implemented by any vendor for either open-source or commercial products” — as the catalyst that has helped software-defined networking (SDN) to catch on at many organizations. The potential is vast, as many IT managers are often faced with configuring and managing complex arrays of bridges, routers, modems and other network equipment.

Storage: While efforts to virtualize storage have been going on for decades through approaches such as storage area networking (SAN), storage “has proved one of the most difficult elements to effectively virtualize,” Ohlhorst states, particularly since it “has been one of the most volatile elements of the data center, with competing solutions, differing ideologies, and increasing demand setting the tone.” Dominance of this space by large vendors has also impeded innovation, he feels. Software-defined storage, or SDS, combined with flash and solid-state disk technology, may inject new thinking into how storage is managed, a critical concern in the era of big data.

As with any new technology paradigm, there is often a great deal of excitement and promise, and SDDC is no exception. Ohlhorst cautions that SDDC is still new, and “few tangible case studies or success stories exist in the public domain.” However, at a time when companies are anxious to dramatically increase their speed to market but remain saddled with enormous legacy investments — a particularly acute problem in the insurance industry — this concept may provide the ultimate abstraction.

(Disclosure: the author is an occasional contributor to GigaOm Pro.)

Joe McKendrick is an author, consultant, blogger and frequent INN contributor specializing in information technology.

Readers are encouraged to respond to Joe using the “Add Your Comments” box below. He can also be reached at joe@mckendrickresearch.com.

This blog was exclusively written for Insurance Networking News. It may not be reposted or reused without permission from Insurance Networking News.

The opinions of bloggers on www.insurancenetworking.com do not necessarily reflect those of Insurance Networking News.
