The Unintended Dangers of Results-Oriented Software Testing

We’re happy to announce that the online version of Testuff now comes with an overview screen. Because this dashboard already exists in the desktop client of our software testing suite, you might be asking – “So what’s the big deal?”

Well, first and foremost, the dashboard creates a more unified experience across both platforms:

Some testers prefer the desktop interface.

Others use the online version exclusively.

And still others switch back and forth, depending on the project.

In all cases, the interface will be more consistent, which is central to the “anytime-anywhere” philosophy with which we design our software testing tools.

But there’s an even more important reason why this online dashboard matters.

Testuff Web Users Now Benefit from Summarized Data

The online version of Testuff now provides users with a top-level view of how all the different components interact, including the:

Tests

Defects

Requirements

Executions

Testers

So again, “What’s the big deal?”

Avoiding the Hidden Perils of Measured Parameters

Too often, we get bogged down in measured parameters and how they behave in isolation. If something isn’t working correctly, we naturally home in on that particular defect or bug.

But successful software testing requires a holistic view in which we study multiple data points from different angles. And we need to understand how individual results emerge based on:

What test was run.

Who executed the test.

When the test took place.

What known defects existed.

What surprise defects emerged.

What the requirements were.

When we focus all of our attention on the parts (instead of the whole), bad things can happen.

If you don’t believe us, just ask Mitsubishi, a Fortune 500 company that was recently caught lying about its fuel economy – a “measured parameter.”

Note that it wasn’t the company as a whole that lied. The massaged numbers came from just one department. That team wanted to look good and focused all of its resources on:

Improving the fuel economy (where it could).

Or

Improving people’s perception of the fuel economy (where it couldn’t).

This happens all the time – and not always as a result of deliberate dishonesty. To a hammer, everything looks like a nail. And it’s easy to become so obsessed with one outcome that you miss the forest for the trees.

This is as true in the software world as it is in the automotive world. But in Mitsubishi’s case, the results were especially damaging. The fuel economy team’s desire to “look good” infected the entire product. And Mitsubishi’s reputation ultimately suffered.

Had the company focused on the product as a whole, this disaster could have been avoided. Tests should have been run from multiple angles with much more detailed pictures of how the components worked together. This would have allowed the team to catch defects sooner and take the appropriate corrective actions.

And this is the value of the new Testuff Web dashboard.

It provides a holistic overview of each project’s:

Progress & status

Needs & changes

Challenges & opportunities

This dashboard doesn’t diminish the importance of the individual components. In fact, having a “micro” view is just as critical as the “macro.” And the dashboard interface provides links to let you dig deeper into each data point. But if the nitty-gritty details drive all of your efforts, you’ll end up missing the bigger picture.

And it’ll become harder to make informed, top-level decisions about what should happen next.

If you’re a Testuff Web user, we hope you enjoy this new dashboard. Feel free to share your experiences in the comments below.