Is 2010 One of the Most Significant Years for Software Architecture?

Modern software architecture has been heavily influenced by the need to architect systems at the scale of the Web. While many have focused on creating monolithic thin-client architectures, the explosion of connected access points, from mobile phones and tablets to IPTV and embedded devices, is pressuring solution architectures to become open.

Jack van Hoof pointed last week to a talk by Joshua Robin. In his talk at the Gov 2.0 2010 conference, Joshua, an IT Architect at the Massachusetts Department of Transportation, explained that he had always been puzzled by the "weather service": how come weather information is so widely available across so many channels? He came to the conclusion that it was because weather feeds were openly available. So, in September 2009, his organization decided to publish the MBTA trip planning information, and within two months there were six applications offering trip planning information to Bostonians. There were all kinds of applications: iPhone apps, Web sites, Web widgets... Later, they held a developer conference and opened up the real-time bus schedule information. Within one hour, someone had already created a real-time display on Google Earth; after two days there was a Google Maps application; within a few weeks there were several other applications, including a street sign and SMS and IVR phone systems. All at no cost to the MBTA. Joshua sees a bright future as data is unlocked and as we step away from monolithic architectures.

Jack noticed the same phenomenon:

Months before we - at Dutch Railways - published our mobile app to supply travel information, a full high quality equivalent was made available to the public domain by someone we didn't know and we didn't pay.

He concludes on a very optimistic note:

The world is changing rapidly. Witness this great momentum and be part of it. After watching the video below your conception of user interfaces will never be the same anymore. This is only the beginning...

Will 2010 be remembered as the tipping point when software architecture became "composite" instead of monolithic, five years after the first official mash-up was published? Critical building blocks, like OAuth, are still being worked on, while new types of clients are appearing almost daily. It seems now that the momentum behind composite applications is inescapable. Do you agree, or do you think this is just another fad, tied to simple data feeds, that will quickly vanish like so many others? Did you witness the same phenomenon as you opened your data? Do you see an even brighter future as composition goes beyond the user interface, into processes and data? Are there any building blocks missing?

It seems like you are taking some ideas from the 1970s (structured/composite design) and presenting them as brand new under the guise of "developers finally becoming competent in software architecture".

To quote Winston T. Wolfe in Pulp Fiction, "Just because you are a character doesn't mean you have character."

I stand corrected, you are absolutely right: software architecture will not evolve as long as people think that the one or two things they learned in the 70s are still applicable 40 years later. These people reduce every modern software architecture concept to what was available back then, be it MVC or Object Orientation. After all, SGML was invented in 1969. I am not sure what Jack and I and possibly a few others were thinking: how could we get excited about composite applications and a sea of services enabled by modern service-oriented concepts?

Just today I received an email from a very senior architect whom I highly respect, asking me whether it was best to adopt an "immutable" service versioning strategy. I replied that I had actually seen, with my own eyes, an entire organization fail with an immutable service versioning strategy. I explained that this customer had started five years ago with such a strategy and was now releasing v26 of its "immutable" services (this is not a joke). Consumers were arbitrarily stuck on a version without the budget to move up.
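The failure mode described here can be sketched in a few lines. This is a hypothetical illustration (the version numbers are from the story above; the function names are made up): under a strictly immutable strategy, every contract change mints a new frozen version that can never be retired, because some consumer is still pinned to it.

```python
# Hypothetical sketch of an "immutable" versioning strategy: every
# contract change, however small, publishes a new frozen version.
# Nothing is ever retired, because consumers are pinned to versions.

handlers = {}  # version -> handler, all kept alive forever


def publish(version, handler):
    # Once published, a version is never changed or withdrawn.
    handlers[version] = handler


# Five years of small contract changes later...
for v in range(1, 27):
    publish(f"v{v}", lambda req, v=v: {"served_by": v})

# A consumer pinned to v3 years ago is still calling v3 today,
# while the provider maintains all 26 handlers in parallel.
print(len(handlers))                     # 26 versions in production
print(handlers["v3"]({})["served_by"])   # 3
```

The sketch makes the cost visible: the provider's maintenance burden grows linearly with every change, while no consumer is ever forced (or helped) to move forward.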

What's comforting is that our industry always ends up doing the right thing... after trying everything else (adapted from Churchill).

Yes, exactly, I bet at least 95% of all SOAs are built like that. Even a published author and very senior technical guy working for a very large vendor recommended that versioning strategy to my company in 2007... I don't want to pick on anyone; the people who made these decisions are very smart. I want to pick on the sentiment that there is never anything new, that everything that needed to be invented was somehow invented in the 70s, and that we should continue doing what we have been doing until all these people, who have so much to lose, retire.

I am not surprised that this strategy is common practice. After reading Thomas Erl's book about WS versioning for SOA, I practically lost my interest in WS-* technology, and frustration set in, because my instinct said "this will never work". Too complex for the mainstream to get under control.

Service contracts are still a vital ingredient, and I like William's approach, which moves away from a validation-centric description. The dreaded XML Schema based validation is the core problem after all. I have picked up an interest in REST recently, but the question of service contracts has only been partly answered. What's an XML MIME type worth if there is no description of its content available?
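One commonly suggested answer to this question (a hedged sketch, not a settled practice; the media type name below is hypothetical) is to stop serving a bare `application/xml` or `application/json` and instead use a vendor-specific media type whose parameters tell the client which description of the content applies:

```python
# Sketch: dispatch on a vendor media type instead of a bare MIME type.
# "application/vnd.mbta.trip+json; version=2" is a hypothetical example;
# the vendor name and the +json suffix identify the contract, and the
# version parameter tells the client which revision of it to expect.

def parse_media_type(content_type: str):
    # Split "type/subtype; key=value; ..." into the bare type
    # and a dict of its parameters.
    parts = [p.strip() for p in content_type.split(";")]
    params = dict(p.split("=", 1) for p in parts[1:])
    return parts[0], params


mtype, params = parse_media_type("application/vnd.mbta.trip+json; version=2")
print(mtype)              # application/vnd.mbta.trip+json
print(params["version"])  # 2
```

The point is not the parsing itself but that the Content-Type header then carries enough information for a client to locate the description of the payload, which a generic XML MIME type does not.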

The 70s didn't have the WWW. That about says it all, I think. Service composition was not a terribly pressing issue: there were no globally distributed services accessible to everyone.

Actually, validation has nothing to do with that. You validate (I hope), whether you use XSD for it or not. The problem is "breaking the client". If you make a change to the service and you break all existing clients, that's where the problem lies in composite applications, especially when you have tens of thousands of existing apps consuming your service. The service developers need techniques to make sure that changes to the service interface will not break clients (changes in the implementation can also break the client), and you need to communicate these techniques to the client applications for the whole scheme to work. Unfortunately, when the specs were designed and the products built, little consideration was given to this problem. Why? Because in the 70s, as you rightfully pointed out, nobody had this kind of requirement. This is where every distributed computing technology has failed, and this is where REST will fail too, because the people who apply REST technologies and principles still live in the 70s.
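One such technique is the "tolerant reader": the client extracts only the fields it needs, supplies defaults for optional ones, and silently ignores anything unknown, so the service can evolve additively without breaking it. A minimal sketch, with made-up field names for illustration:

```python
import json

# Tolerant reader sketch (hypothetical trip-planning payloads): the
# client names only the fields it depends on, defaults the optional
# ones, and ignores anything it does not recognize.


def read_trip(payload: str) -> dict:
    doc = json.loads(payload)
    return {
        "origin": doc["origin"],                       # required field
        "destination": doc["destination"],             # required field
        "delay_minutes": doc.get("delay_minutes", 0),  # optional, defaulted
        # any extra fields the service adds later are simply ignored
    }


# v1 of the service: no delay information yet
old = '{"origin": "Alewife", "destination": "Braintree"}'
# v2 added delay_minutes plus a field this client has never heard of
new = ('{"origin": "Alewife", "destination": "Braintree", '
       '"delay_minutes": 4, "crowding": "high"}')

print(read_trip(old)["delay_minutes"])  # 0, old payloads still work
print(read_trip(new)["delay_minutes"])  # 4, new fields don't break us
```

Contrast this with strict XSD validation of the whole document, where the unknown `crowding` field in the v2 payload would reject the message outright, which is exactly the "breaking the client" failure described above.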

You are right, "breaking the client" is the general problem. I was pointing to the most popular validation technique (in the Java WS space at least), which successfully breaks any client in no time :-) The herd is trudging in this direction and I got tired of arguing.

What are we supposed to do to look more 2010ish? What kind of format and format specification is capable of solving this problem?

That would probably be worth a post. In a nutshell, and IMHO, first understand the paradigm shift we are currently living through: product and services companies need to bring IT into their products and services. You can no longer bring to market products and services that cannot readily integrate with the information systems of your customers. This is what I call SaaS and PaaS: Service as a Software and Product as a Software. (I know SaaS means Software as a Service; this is not a typo, please understand this is a paradigm shift.) We entered the composite application era a few years ago; I think 2010 is the inflection point. None of that existed in the 70s.

Second, you'll often be on both sides of SaaS and PaaS, so you need to understand and start with the architecture and technologies of composite applications, including things like versioning. In particular, you'll need to understand that OO offers zero architectural support, always has, always will. The tragedy is that modern software architectures grew up alongside traditional programming environments while vendors tried painfully to systematically reify architecture within those programming environments. Then people like John say: look, nothing has changed, I can still write code the way people were writing it in the 70s. True, but fundamentally flawed. Even initiatives like Spring, SCA or OSGi, which have successfully brought architecture into programming environments, are too weak to support the composite application wave that is coming.

Honestly, I think applications have always been at least semi-composite. We can do things much faster and simpler now, surely that is true, but even before JSON, REST, Web Services, CORBA, and MQSeries, there were simple file updates or database integration; in fact, FTP is still a very commonplace way of sharing information between disparate systems.

While I very much like the idea that the MBTA is providing this service, to use them as an example, couldn't they have kicked out a flat file to a public FTP server somewhere and achieved the same thing?

Granted, there are monolithic systems out there; I would put SAP, Oracle, and PeopleSoft in that category, and they have their obvious advantages.

On another topic, it is great to see folks like the MBTA realize the value of their data before someone like Google "generously" offered to help them offload all their processing to Google, with Google selling their information in exchange. Unfortunately, many state and federal agencies have little idea of the value of their information, the presentation of which could do much to bring in needed income. The State of Michigan has an interesting system where they connected a payment manager and a commerce engine so that citizens could search their UCC data, adding their queries to a "shopping basket". What would be really interesting is if they could provide at least the free service as an API.

The ideas may not be new, and may well have been around since the 70s, but I don't think that composition of services has ever been achieved at this scale before. The sheer number of different platforms, types of devices, and consumers/clients using these services is tremendous. For the first time in IT history we have ubiquitous service addressing and consumption through Web technologies.

Also, it is very different to use a pre-existing component that you have to build and deploy together with your own solution than to simply address something already available, deployed, and managed by someone else. This is not component-based development, even though it can loosely be considered to fall under the same "architectural" patterns.

Sure, sometimes we go back to basics in IT and seem to be reinventing the wheel, but I believe this happens because the industry took a wrong turn somewhere (usually because it overcomplicated the solution to a common problem - as I believe is the case with SOA and SOAP).