11 comments:

Moritz Berger
said...

Abstraction-coherency failures (sorry, got to find a better name): similar to partitioning failures, but vertical instead of horizontal. I.e., due to inconsistent distribution of functionality across abstraction layers, composing applications breaks the layer containment of constraints and dependencies. The hallmark of good/bad design, prominently encountered in frameworks and in general-purpose components and libraries. Perhaps related to synergistic failures (which appear to be the catch-all for everything that doesn't belong in the other categories ;-))

Moritz, thanks for your comment. One point I should make about synergistic failures is that these failures are actually the best defined of all the failure categories. The test involves checking for synergy between two (or more) functions. In the few cases in which it is not obvious whether synergy is or is not present, one has a simple problem that can be escalated to someone with a better business overview of the problem domain.

Now, as to your proposal for abstraction-coherency failures, this is an interesting idea. I agree with you that systems that fail this test have problems. The question I have is: is "complexity" the best possible description of that problem? We don't want to use the word "complexity" to describe everything that can go wrong with a system. For example, a system could be bad because it doesn't meet business requirements, yet still not be complex in a mathematical sense.

I think complexity is something that very large organisations with unclear or varying business goals can't fail to feed. For example, in these times of financial uncertainty, companies are looking to cash in on tactical initiatives rather than taking a measured, prescripted approach, using a framework such as TOGAF to develop an enterprise architecture. As a result, the IT estate is under constant change, with many conflicting requirements often making a mess even messier. So I would say a major cause of complexity is the business areas not really knowing where they want to be and not knowing how to partition themselves. In the words of George Harrison: if you don't know where you're going, any road will take you there!

One reason that companies aren’t using a “measured, prescripted approach using a framework such as TOGAF” is that the time and money investment to do so is so large. Often it takes so long to create an enterprise architecture that by the time it is created it is out of date.

In a recent talk I gave comparing the various EA methodologies, I categorized methodologies into three generations as follows:

Generation One
Time Period: 1987-1995
Mantra: We need to align IT and business!
Representative Methodologies: Zachman

Thanks for your comments. Do you think that the concept of aligning IT and the business is no longer relevant or valid? (Sorry for being anonymous; I need to set an account up.) Do you agree with my assertion that continual reactive change is a source of (or driver towards) continuing complexity?

I believe that aligning IT and the business is extremely important. It’s just that I believe that you need to partition the enterprise into simple subsets (autonomous business capabilities, in my terminology) and then achieve the alignment within those subsets.

As far as continual reactive change, I don’t see that so much as a source of complexity as much as a symptom of complexity. When complexity is out of control, there is little you can do except be in continual reactive mode. Once we properly manage complexity, we can take the time to be more thoughtful and deliberate.

Maybe I'm addressing something that's outside the scope of this conversation, but my initial reaction to your "Five Causes" was that it was limited to the "model" being instantiated in an IT system. I see by the comments that I should perhaps interpret "system" a bit more broadly.

Regardless...after a decade or so of kicking around concepts from complexity, philosophy, social psychology, and large IT system development, I've found Dave Snowden's Cynefin framework helpful in thinking about complexity. Here's why: it looks at the intersection of the subject (the knower) and the object (the system, the system context, and the associated data/info).

For IT systems development, that typically means focusing on creating models of a slice of a "frozen" context, a static knower, and a system that connects the user to the context slice. For Known/Knowable contexts, that works well. However, for Complex contexts (and an increasingly hyperconnected world is making these more the norm than not), it seems that we may need to expand root causes to address such issues as (a) the context of systems usage (e.g., for Known/Knowable work, or for Complex work), (b) the knower's ability to effectively engage the context, and (c) the ability of the knower's organization to do likewise (where multiple individuals are trying to maintain decision-making coherence).

Maybe I'm misreading this, but it seems that there's an assumption of a Known/Knowable context in much of this discussion.

BTW, this point applies equally to the construction of large IT systems, an activity where the importance of shared sensemaking and maintenance of coherent understandings is not always appreciated. The agile movement may be, in part, based in a growing awareness of these challenges.

One viewpoint that I've found helpful is to distinguish 'complexity' and 'complication'. Complexity is an inherent property of a domain. Complication is an emergent property of a system addressing a domain.

Take an overly simple example. We know that comparison-based sorting takes O(N log N) time - that's the inherent complexity of the problem domain. Attempting to engineer the one sorting algorithm to rule them all, the one that addresses all possible additional requirements anyone might have, would lead to immense complication.
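The distinction can be made concrete with a small, hypothetical sketch (the function names and flags here are invented for illustration): a plain sort already meets the domain's inherent O(N log N) complexity, while a "universal" sort accretes options that add complication without adding any new inherent complexity.

```python
def simple_sort(items):
    # Meets the inherent complexity of the domain:
    # comparison-based sorting is O(N log N), and that's it.
    return sorted(items)

def kitchen_sink_sort(items, key=None, reverse=False, dedupe=False,
                      clamp_to=None):
    # A hypothetical "one sort to rule them all". Every extra flag is
    # complication layered on top of the same O(N log N) core - the
    # inherent complexity of the problem has not changed at all.
    result = sorted(items, key=key, reverse=reverse)
    if dedupe:
        # Drop duplicates while preserving sorted order.
        seen = set()
        result = [x for x in result if not (x in seen or seen.add(x))]
    if clamp_to is not None:
        # Keep only the first clamp_to elements.
        result = result[:clamp_to]
    return result
```

Each new requirement folded into `kitchen_sink_sort` is a "complication" in the watchmaker's sense: engineered convolution around a core whose difficulty was fixed by the domain.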

In watchmaking, such artificial degrees of convolution are referred to as 'complications' as well.

Complexity is not to be battled or ignored. It is not even a problem; it is a fact of life. Complication, on the other hand, is a phenomenon of engineering that can be recognized and addressed.

As the saying goes: "There are problems and there are facts of life. Problems we strive to solve. Facts of life we don't solve -- facts of life we have to live with."

As I wrote, this is something that has helped me think about such things.

Craig Brown points out that there is "still plenty of complexity in the business..." and that "all of these interfere with business-IT alignment." I agree with this.

Some amount of complexity is inevitable. But much of the complexity we have is avoidable. We should distinguish between these two. I often talk about "unnecessary" complexity as the type that we are trying to eliminate.

In his comment, Clemens distinguishes between "complexity" and "complication". Clemens, by the way, has done much to give us useful approaches to managing complexity with his considerable writing on components.

I don't think I agree with using the term "complication" to describe unnecessary complexity. This word seems to me to imply an afterthought, or something injected into a system. It is closely related to its verb form, "complicate", implying an active introduction. "Complexity", on the other hand, is a dyed-in-the-wool noun. Complications, for me, are something I want to avoid introducing. Complexity, for me, is something I want to get rid of. At least, that's how I read these two words.

Back to Walter's comments. I have heard about Cynefin, but haven't studied it yet. I'll take a look at it.

You mention an assumption of a "known/knowable context in much of this discussion". My main assumption is not that a given system is or isn't knowable, but that its "knowability" is enhanced once the (unnecessary) complexity is removed.

When you review Dave's definition of known/knowable (Cynefin) and complex (Cynefin), my comment may make more sense. Dave discusses strategies to move a context that is largely complex into a more known/knowable domain. However, his primary focus is on contexts that are inherently complex and therefore require a "probe-sense" approach (vs. a "sense-analyze" approach).

As I noted above, Dave (whose background is physics and philosophy) initially framed Cynefin as a matrix with ontology on one axis and epistemology on the other, with the focus on human sensemaking...which is not exactly the same sort of thing you're addressing. However, I think you'd find Dave's perspective thought-provoking.


About Me

Roger Sessions is the CTO of Roger Sessions, Inc. and ObjectWatch. He has written seven books and dozens of influential white papers. He is recognized as a Fellow of the International Association of Software Architects. He has spoken at hundreds of conferences around the world. He holds multiple patents in software and Enterprise Architecture. He is the inventor of the SIP methodology, a patented Enterprise Architecture Methodology for minimizing the complexity of large IT systems. Join him on Twitter: @RSessions.