The New Space Race: Who Can Build the Most Powerful Computer?

Ever since the Soviet Union launched Sputnik into orbit, the United States and other countries have been engaged in a space race, one that saw us reach the moon first and that still awaits the first manned expedition to Mars. Fascinating as that is, however, it has been supplanted in scientific circles by another competition: the race to build the fastest computer ever.

According to multiple Internet sources, we and others are now involved in an international race to build an exascale supercomputer, a system that will compute at speeds unheard of at present. The sources note that the most powerful systems today are measured in petaflops, meaning they are capable of quadrillions (10^15) of operations per second. An exascale system is measured in exaflops, one exaflop being 1 quintillion, or 1 million trillion (10^18), floating-point operations per second. China, Europe and Japan are all reportedly working on exascale computing platforms.
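For a sense of the scale, here is a quick back-of-the-envelope comparison, sketched in Python purely for illustration (the constants are simply the standard definitions of these units, not measurements of any particular machine):

```python
# Standard performance prefixes, in floating-point operations
# per second (FLOPS). Illustrative constants only.
PETAFLOP = 10**15  # one quadrillion FLOPS: the scale of today's top systems
EXAFLOP = 10**18   # one quintillion (a million trillion) FLOPS

# An exascale machine is a thousand times faster than a one-petaflop system.
print(EXAFLOP // PETAFLOP)  # 1000

# At that ratio, a job that occupies a one-petaflop system for a full day
# would finish in under a minute and a half.
seconds_per_day = 24 * 60 * 60
print(seconds_per_day / (EXAFLOP // PETAFLOP))  # 86.4 seconds
```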

Needless to say, the nation that beat all others to the moon is also hard at work on such a system, the sources note. The Department of Energy has been working on an exascale computing platform for two years, although funding to continue the project has not yet been approved.

Beyond the all-world-nerd bragging rights, just what would having such a system mean to our country, or to others? It would mean, for one thing, that everything we do now with our systems could be done more quickly, in fact at blinding speeds. From an insurance point of view, that would be interesting. For our own enterprises, significantly faster processing would mean we could bring products to market much more promptly and respond more quickly to customer and business-partner inquiries and problems.

It would also mean, however, that when something bad happens, it happens at blinding speed as well, perhaps too quickly for us to intervene or limit the damage. We’re all aware that the world of the Internet is a fragile one, and keeping up with security threats to our valued enterprises is, at best, a neck-and-neck battle with the forces of evil. When everything happens ten or a hundred or a thousand times faster, however, we have to wonder whether we will be able to neutralize the cyber-missiles fired at our enterprises and our nation. One also wonders whether insurers will be eager to grant coverages in light of such threats.

One of the biggest problems we face in developing new technologies is that we are often not ready to handle the very capabilities we seek to achieve. For insurance and financial services in particular, the move toward supercomputing, which seems inevitable, should be a cautious one. The hope is that as the exascale systems proliferate and accelerate the pace of electronic business, we also will accelerate our efforts to protect the data that are so vital to our survival. Until we see that happening, however, we need to put a halter on the runaway exascale thoroughbred, at least in terms of our critical systems.

Ara C. Trembly (www.aratremblytechnology.com) is the founder of Ara Trembly, The Tech Consultant, and a longtime observer of technology in insurance and financial services.

Readers can reach Ara at ara@aratremblytechnology.com.

This blog was exclusively written for Insurance Networking News. It may not be reposted or reused without permission from Insurance Networking News.

The opinions of bloggers on www.insurancenetworking.com do not necessarily reflect those of Insurance Networking News.
