A designer knows he has achieved perfection not when there is nothing left to add, but when there is nothing left to take away. ~Antoine de Saint-Exupery -- Note, the opinions stated here are mine alone and are not those of any past, present, or future employer. --

Sunday, August 31, 2008

Architectural Shelf Life

Architectures are often thought of as having a useful life, but that life is usually hard to predict. Most of the time it is determined by how much pain an organization experiences in trying to extend the architecture to meet current business needs. After giving this some thought, I think I can state that an architecture typically has a 10- to at most 15-year useful life. To explain this idea, though, I need to introduce another concept:

Architectural Shelf Life - The duration for which a collection of patterns and technologies remains applicable when starting a new system design.

So to elaborate: if you have the chance to start over, a complete green field, what architectural patterns and technology do you use? I argue that these change about every 5 years. From that I derive that any architecture should be replaced, in part or in whole, about every 2-3 generations of this shelf life. Not convinced of the shelf life argument?

Roll the clock back to 1990 and look at how enterprises delivered services to their customers. Most offered no access from the customer's location. Some had basic communications in place via email. The more progressive might have offered forms via Compuserve, Prodigy, Delphi, or AOL. Applications were implemented monolithically, scaled vertically, and most likely written in languages that are waning in popularity or possibly already dead. Databases were also monolithic, using mid-tier to mainframe servers with direct attached storage.

By 1995, your customers could begin to find you on the web. The form screens were literally translated to web forms. The application stack and data storage had not changed much. The web was primarily about user interface. Upstart competitors were challenging that with applications written in C++ or fledgling scripting languages. Their databases were still largely the same platform as in 1990.

If you weren't too busy fixing Y2K bugs coming into 2000, you would have seen a significant acceleration in change. Pure web architectures were now common, though the web fronting legacy enterprise applications was still prevalent. Applications were now multi-tiered, although still largely monolithic deployments. Scale-out architectures were emerging as the preferred way to grow business logic capacity. Mid-tier servers were taking over database duties from mainframes, and SANs became the preferred way to manage storage. Applications were being written in Java, VB, and the new language C#.

Over the past 8 years, databases have evolved to horizontal scaling through sharding. Services have emerged as the preferred design pattern for integrating tiers and components. Java and C# frameworks have matured to provide dramatically better productivity and have been joined by Ruby on Rails and Django, to name a few. Distributed storage on low-cost, unreliable devices is gaining popularity.
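To make the sharding idea concrete, here is a minimal sketch of hash-based shard routing. The shard count, key choice, and in-memory "shards" are assumptions for illustration only, not any particular product's design; real systems add rebalancing, replication, and persistence on top of this.

```python
# Minimal sketch of hash-based sharding (illustrative assumptions only).
import hashlib

NUM_SHARDS = 4  # assumed fixed shard count for the example

def shard_for(key: str) -> int:
    """Route a key to a shard index via a stable hash."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return int(digest, 16) % NUM_SHARDS

# Each shard holds only its slice of the data; here, plain dicts
# stand in for separate database servers.
shards = [dict() for _ in range(NUM_SHARDS)]

def put(key: str, record: dict) -> None:
    shards[shard_for(key)][key] = record

def get(key: str):
    return shards[shard_for(key)].get(key)

put("alice", {"plan": "basic"})
assert get("alice") == {"plan": "basic"}
```

The point is that no single server holds the whole data set: each query touches only the shard its key hashes to, which is what lets the database tier scale horizontally.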

Of course some of you will immediately jump to correct my memory or facts. That's not the point; the dramatic shift in architectural patterns and technology is what you should take from this. Additionally, looking at the business players that have emerged, and those that have faded due to the shifts in technology, paints a chilling picture of how much technology disrupts the world of business. The improvements in developer efficiency and the lowering costs of deployment make it possible to build and operate products more cheaply every 5 years. Of course there is a lot more to a successful business than operating cost, but facing competition that has a better cost structure is not a desirable situation.

The challenge you face, then, is how to continually evolve your platform to adopt new patterns and technologies as they emerge. This can be daunting when you have a well-established customer base and an overflow of business features in the pipeline. Operations will be concerned about maintaining availability through any transition, and the business wants enhanced product functionality. As long as those two goals can be met with your current architecture, it takes considerable momentum to cause a shift. In fact, even when the effectiveness of the current architecture becomes questionable, it is not uncommon to still meet resistance to making a major architectural shift.

The fallacy in this situation is the belief that a new architecture will be too disruptive and costly. What is missed is that the new patterns and platforms can be used to disrupt your business anyway. If you have a successful business, many others want a share, and technology becomes the tool they can use to go after it. The disruption they can cause by offering your services at lower cost or with more compelling features is far greater than any disruption incurred by internal architectural shifts. In simple terms, if you don't use technology to disrupt your current way of doing business, somebody else will. Therefore, incorporating architectural shelf life, and therefore architectural life span, into your business planning is a necessary investment to remain current.

Comments

It is interesting to compare this with security architectures which are very long lived. Kerberos was done in the late 80s, certificates in the 70s and 80s, along with the basic cryptographic underpinnings like RSA. So there is not much new under the sun on the security side.

The problems are of course that 1) as you point out, the software architectures change, and 2) the threat models change. Think the folks in the 70s foretold Amazon?