It’s not often one gets a chance to revisit a past development effort, but that kind of rare opportunity makes a developer’s life a little bit sweeter. About ten years ago, the company I now work for, X by 2, was contracted by a client to design, develop, and implement a then-state-of-the-art system: an internal Web application, replacing a client-server system, that employees at multiple locations would use to manage their daily work. It was a wide-ranging application that did everything from guiding employees through processes for working with customers, to complex transaction processing, to preparing documents for printing and mailing. The client was growing rapidly, and their system at the time could not support the growing customer and employee base.

Fast forward to 2014-15, and my company has been engaged by that same client to replace the application we developed back then with something more suitable for their mature, industry-leading business processes.

As a developer, the most interesting part of the whole exercise is comparing what was or was not possible then with what we can do now. Let’s start with the software architecture for this effort, old and new.

The good news is the validation that a well-designed architecture can stand the test of time. The system stuck around for a long time, providing solid value to the client. Over the years, they were able to add many things that hadn’t even been considered when we initially designed it, and they could do so because, for the most part, the original architecture held together. However, after ten years there are certainly things that can now be done better, thanks to the benefit of hindsight, advances in technology, and the normal things that people do to systems they use for a long period of time.

Another benefit of coming back a decade later was that the client had had ten years to refine their own processes and learn what was important to them and what was not. Naturally, as their business evolved over those years, some things gained importance and others lost it. For us as consultants, this was a chance to move the less important things into the background, or remove them entirely, and bring the more important things to the forefront of the system. One example was a form that employees were required to complete every time they finished a particular process. The client learned that employees were more interested in moving on to the next process than in entering accurate information, so they would generally just pick the first option and submit the form. Since the results were unreliable and the form was considered an annoyance, we decided to remove it and record whatever information we could extract programmatically, without asking the user to enter anything manually.

From a technical perspective, the tools we used a decade ago – a custom-written framework using XML and Visual Basic 6 – were less capable than today’s tool sets. By comparison, the updated system is a single-page application built on a modern JavaScript framework with WebAPI on the back end. Additionally, database access has been updated from stored procedures to Entity Framework and Breeze.

And what do these modern frameworks get us? One advantage is that, because we don’t have to write all the boilerplate code we did the first time around, our own code is simpler. For example, a simple request for basic customer information would once have required us to write code in every layer of the application, down to the database level. Now we can make that request on the client side, and the framework takes care of traversing the layers to get the data. The net result is a good example of mixing something old with something new: in some cases we kept the existing stored procedures, which have been fine-tuned and optimized over the years to meet the performance requirements. The new framework, capable as it is, doesn’t necessarily give us that level of control without a lot of extra work. To be clear, it’s not that we couldn’t perform the same optimizations within the framework; in some instances it just would have taken us longer to figure out how to do so than to use the existing stored procedures. That falls under the “it’s good to have options” category.
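To make the contrast concrete, here is a minimal sketch of the idea. This is not Breeze’s actual API, and the entity names and data are hypothetical; it only illustrates how a declarative, client-side query stands in for the layer-by-layer plumbing we once wrote by hand.

```javascript
// Simplified, Breeze-style query builder (illustrative only, not the real
// Breeze API). The client expresses *what* data it wants; the framework
// handles the layers in between that we previously coded ourselves.
class EntityQuery {
  constructor(resource) {
    this.resource = resource;
    this.filters = [];
  }
  where(field, op, value) {
    this.filters.push({ field, op, value });
    return this; // fluent chaining, as in Breeze and Entity Framework
  }
  // In a real framework this would issue an HTTP request to the WebAPI
  // back end; here we evaluate the filters against an in-memory stand-in.
  execute(dataSource) {
    return dataSource[this.resource].filter((row) =>
      this.filters.every((f) =>
        f.op === 'eq' ? row[f.field] === f.value : true
      )
    );
  }
}

// Hypothetical stand-in for data the server would return.
const fakeServer = {
  customers: [
    { id: 1, name: 'Acme Mutual', state: 'MI' },
    { id: 2, name: 'Great Lakes Ins.', state: 'OH' },
  ],
};

const results = new EntityQuery('customers')
  .where('state', 'eq', 'MI')
  .execute(fakeServer);
```

The point of the sketch is the shape of the code: one declarative call site on the client, rather than a hand-written method in each layer from the UI down to the database.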

Another advantage, which comes from using a JavaScript framework, is that more of the work of the application is done on the client-side, moving processing away from the server and improving scalability. This has brought both benefits and a new set of challenges, as client-side resource use, primarily memory, now has to be considered as part of our design. Becoming more familiar with how browsers use and release memory has been critical to navigating these new challenges.

In part one of this blog I’ve focused on some of the technical aspects of the initiative. In part two, I’ll discuss the development methodologies and business benefits.

This blog entry has been republished with permission.


The opinions posted in this blog do not necessarily reflect those of Insurance Networking News or SourceMedia.
