Visualize First. Build Later.

The essential role of communication among all parties involved in a software development project is clear. Equally clear: the power of graphical models and similar visualizations to enhance communication. More problematic are the questions of which kinds of visualizations should be used, at what point in the development process they are most useful, and whether automated visualization tools add value.

A recent article in Computerworld, "Visualize First, Build Later: The Advantage of Simulation Tools" by Esther Shein, advocates comprehensive use of an automated visualization tool early in the development process. This position is consistent with traditional, waterfall-like development methods. A recent webinar, "Leveraging Requirements Visualization – A Biotechnology Company's Agile Case Study," presented by Susanna Goldenstein, suggests that using a visualization tool with a non-traditional (agile) method is also highly effective. Both the article and the webinar discuss the same automated tool, iRise, described as "an enterprise visualization platform used to quickly assemble working previews of business software that mimic the exact look, feel and behavior of the final product."

The Shein article suggests that automated tools that visualize user interfaces, model business processes, and simulate user interactions are superior to more common tools, like spreadsheets, that capture long lists of system requirements in primarily textual form. Goldenstein's webinar implies that visualization tools are a necessary enhancement of, or replacement for, story cards as a primary agile communications tool.

Shein quotes Melinda-Carol Ballou, an analyst at IDC in Framingham, Mass., on the importance of good requirements, particularly in today's economic climate:

The challenge that business stakeholders face in trying to communicate their requirements to software developers isn't a new problem. But dealing with that challenge has become more critical in the wake of the economic downturn. With increasingly scarce resources, there isn't any margin for error. So if there's a disconnect between what you create and what the business actually needed, the costs of failure are more pronounced. When you're visualizing requirements and looking at a screen, it gives users something tangible, so they can see what something means in physical terms.

Being able to capture better requirements should, Shein suggests, also address the well-known problems of "challenged" and "failed" projects.

The Standish Group reports that there was a decrease in project success rates from 2006 to 2008. ... In 2008, about 44% of projects were late, over budget and/or came in without all of the required features and functions [challenged], and 24% failed and were canceled prior to completion or delivered and never used, according to Standish. In 2006, the failure rate was at 19%.

If requirements visualization tools can improve this situation, their value is obvious. To what degree does this kind of tool actually help? A lot of people believe that such tools are essential. iRise is but one example in an increasingly crowded market; the Computerworld article mentions several alternatives.

As Shein noted, the capture of complete, accurate, and unambiguous system requirements is not a new problem. Fred Brooks suggested (in "No Silver Bullet: Essence and Accidents of Software Engineering") that the lack of a conceptual construct (a mental or visual representation) of software was THE essential difficulty.

The use of automated tools to capture requirements has an equally long history. Dan Bricklin (co-creator of VisiCalc, the first commercial spreadsheet, often credited as the "killer app" that sparked the desktop computer revolution and Apple's early success) created Demo, a "rapid prototyping tool" that allowed graphical depiction of user interfaces and simulated the basics of interaction with those interfaces, in the mid-1980s. The value of using such tools to improve communication and to increase customer satisfaction with software interfaces was clearly demonstrated, and Demo was soon followed by numerous similar products. Two problems slowed the adoption of this kind of tool in the decades that followed: first, the idea of prototyping was very controversial, with software engineering proponents deriding it as dangerous and error-prone; and second, user expectations were faulty - seeing the complete prototype, they tended to think the application was essentially done!

The market for requirements visualization tools is seen as a growing one, increasing from the $194 million reported in 2007 to a projected $290 million by 2013. This is not explosive growth, and it remains to be seen whether the more sophisticated products available today can deliver what their predecessors could not.

The article states "Microsoft Corp.'s Visual Studio 2010 Ultimate includes Expression, a tool providing an informal way to iterate and prototype user interfaces".

This is incorrect. Microsoft Visual Studio Ultimate includes a number of architecture tools which can be used to visualise requirements using DGML and UML. These tools can also be extended further using the VS SDK. They are aimed at senior developers and architects.

Microsoft Expression SketchFlow is a completely separate product and provides a way to visualise and storyboard interfaces for applications. These tools are aimed at UI designers.

Why would you invest so much time and effort into creating a prototype or mockup when you could be building the real application in an iterative manner? Aren't these tools just promoting Big Design Up Front, which has been soundly discredited as an approach to building systems?

Mockups are an extremely effective tool for bridging the gap between expectations and outcome. "Big Design Up Front" failed because mounds of pages with words and sentences describe the outcome, whereas high-fidelity mockups are the outcome.

Mockups are especially effective when a lot of development work is involved. If up front you can eliminate some iterations with a massive team of product managers, usability experts, designers, front end developers and software engineers, you will save time and money. If you need a website with 3 pages, mockups may or may not be effective.

So, whether mockups are "Big Design Up Front" done right, or part of the iterative process, what is certain is that many developers are finding them useful. iRise is succeeding, and new mockup tools are springing up like mushrooms - Balsamiq, Mockflow, OmniGraffle, ScreenSketcher, Mockingbird, HotGloo, etc. They have a market, and apparently many of them are agilists too.

Yes, I use Balsamiq as well. I love it as a very low cost, low fidelity tool to mock up pages and screens. I also used Dan Bricklin's DEMO back in my DOS days in the early 90's. I also agree that mockups are a very useful way to collaborate - that has been my experience for my 20+ years in this business.

However, saying "iRise is succeeding" doesn't sound right to me. No tool succeeds, the people using the tool do. I've been at clients where a UI consultant created the whole interface using Balsamiq and then left at the end of the contract. The people were afraid to make obvious changes because they didn't know the tool. Was that Balsamiq's fault? No. It was the fault of the UI designer and the people who hired that person, because they lost sight of the fact that you need to collaborate when creating a system.

I've been reading recently about the move to a more iterative, incremental process in the UI/UX world. People there are talking about focusing more (but not completely) on the UX in early iterations, then moving that focus more to the system (also not completely) as the project progresses. That makes more sense, and is what we do anyway! :)

I'll end with a question. When you're designing an entire system up front, how do you know when you're 100% done?

I would say mockups are a very useful compromise between the advantages of BDUF (eliminating what you don't want quickly) and pure iterative (discovering the details of what you do want when they're needed).

In my experience, iterative is great for the backend code, and not so great for the front end code - at least not the design of the front end. Once the front end code is implemented based on a mocked-up design, iterative is great again.

Most people need to see how the "whole" looks and feels before they'll commit to accepting the pieces. Designing the bulk of the UI for style, direction, and buy-in saves a LOT of time. Then the code to actually produce that can be implemented in iterations, and the product owners are comfortable seeing pieces because they know where it will end up.

Designing the bulk of the UI for style, direction, and buy-in saves a LOT of time.

Do you have any proof of that? How much time? How do you know what "the bulk" is? How do you know you're done?

I'm not saying that you shouldn't design some of the UI up front. I am saying, though, that designing ALL of the UI is going to incur some speculative work. I believe (and have experienced) that you don't need to design an entire UI to get an idea of the general flow and feel. This is something that can be done collaboratively at whiteboards with low-fi tools. If a stakeholder needs to see something more "real", then so be it. That stakeholder should also be aware of the cost of doing so.

It also shouldn't be difficult to change the application's navigational structure when you learn that you need to do it. If it is, then you haven't built the system very well. That applies to work that I've done in web applications, Java rich clients, PowerBuilder rich clients, C/C++ applications, Clipper apps... you get the picture.

"When you're designing an entire system up front, how do you know when you're 100% done?" You don't. What you have is a lower probability of failure caused by a total lack of understanding of the system. Nailing down requirements up front does work, provided you don't assume you know everything at that point. You factor in risk costs for future change and you include a change process (as light as you can, but no lighter).

Currently most organisations are not capable of handling Agile, and where they cannot do so, using mockups, prototypes and other requirements-gathering techniques does reduce project risk.

Poorly run Agile projects in immature IT organisations and poorly led businesses have just as great a failure rate as traditional, so-called waterfall approaches.

If you are investing "so much time and effort", you are doing it wrong. It only requires a fairly small amount of time relative to the whole project, but it does help the business visualise what they want and thus reduce error rates later in the project. Even in an agile world mockups can be useful, if done as lightly as possible, but no lighter.

Cameron, this is not intended to be an ad - hopefully the rewrite makes it clear that it was simply circumstantial that both sources cited one product as their exemplar. Our intent here was to point out the resurgence of interest in comprehensive visualization tools. - dw

Dave has a point, and so does Shimon. You can have both -- they are not mutually exclusive, although in these tools, they are.

Dave's point goes to what Bertrand Meyer termed as "seamlessness", a quality of development artifacts that allowed them to be refined or consumed downstream in the development process.

What we need in this space is a tool that not only "prototypes", but also produces development resources. This sounds hard, but isn't really as tough as it sounds when you truly understand MVC.

Innate in MVC are the concepts of a Domain Model and a View Model (App Model). Users think in terms of the application they want to build: screens, reports, etc. Each screen can be thought of as containing a "screen model" that backs it up, separating presentation from data. Said another way, you can design a screen that contains dynamic content, and as long as I hand you a data structure that contains that dynamic content via XML, JSON, BlazeDS - whatever - you are good to go. The data structure becomes the "screen model" - the only data the screen knows about (a la Rails, although Rails got it from countless previous implementations). In more elegant implementations, you can think of the screen model (whether page-based, RIA, or desktop) as a data "cache" or a distributed database, but that is another talk we can have. :-)

So, with users focused on describing the View (App) Model, the UI, pageflow/workflow/bizProcess ("flow"), and reports, in MVC terms that leaves the work of the Controller: to "map" or marshal data from the Domain Model, transforming it into the View Model, and then to take changes from the View Model and transform them into appropriate calls to the Domain Model. This is the essence of SOA and ESB (with lots of technology choices added :-)

The controller that performs this mapping can also be automated, as proven by several approaches: data binding (automagic data synchronization), an approach similar to Dozer, Service Data Objects (en.wikipedia.org/wiki/Service_Data_Objects), etc.

With this "separation of concerns", it is then possible to build a tool that not only allows robust application prototyping, but actually produces the UI, flow, and View Model as development artifacts, leaving the job of "mapping" the application onto the Domain Model, which can also be made much easier through tooling.
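The Domain Model / screen model split described above can be sketched in a few lines. This is a minimal illustration, not any particular tool's API; the `Customer` type, field names, and mapping function are all hypothetical. The point is that the screen only ever sees the "screen model" (here, a JSON payload with presentation-ready fields), while the controller owns the mapping from the domain object:

```python
import json
from dataclasses import dataclass

# Hypothetical domain model: how the backend happens to store a customer.
@dataclass
class Customer:
    customer_id: int
    first_name: str
    last_name: str
    outstanding_balance_cents: int

def to_screen_model(customer: Customer) -> dict:
    """The controller's mapping step: produce the 'screen model' --
    only the fields the screen displays, already formatted for presentation."""
    return {
        "displayName": f"{customer.first_name} {customer.last_name}",
        "balance": f"${customer.outstanding_balance_cents / 100:.2f}",
    }

def render_screen_payload(customer: Customer) -> str:
    """Serialize the screen model; the UI never touches the Customer itself."""
    return json.dumps(to_screen_model(customer))

payload = render_screen_payload(
    Customer(customer_id=7, first_name="Ada", last_name="Lovelace",
             outstanding_balance_cents=123450)
)
```

A prototyping tool built around this separation could let users design screens against nothing but such payloads, then hand `to_screen_model`-style mapping stubs to developers as real artifacts: the screen design survives unchanged while the domain side is filled in later.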
