Since the beginning of automated testing, test teams have frequently implemented new tools without first developing a process or strategy that details how they'll make the test tool productive. Don't be that team, says automated testing expert Elfriede Dustin.

Today's software managers and developers are being asked to turn around
their products within ever-shrinking schedules and with minimal resources. This
is due to a push from government and commercial industries toward streamlining
the product acquisition lifecycle, as well as the commercial software
industry's need to get products to market early in order to survive. Most
information resource management (IRM) organizations are struggling to handle
this challenge within the constraints of existing information-management
budgets, while simultaneously implementing other competing
information-management initiatives.

In an attempt to do more with less, organizations want to test their software
adequately, but as quickly as possible. Faced with this reality, software
managers and developers have little choice but to introduce automated testing to
their projects. But these software professionals may not know what's
involved in introducing an automated test tool to a software project, and may
not be familiar with the breadth of application that automated test tools have
today.

Manual testing is labor-intensive and error-prone, and doesn't support
the kind of quality checks that are possible through the use of an automated
test tool. Humans make mistakes—especially when mundane, repetitive tasks
are involved. Computers are especially effective in performing these mundane and
repetitious tasks—without the mistakes. The introduction of automated test
tools helps to replace archaic and mundane manual test processes with a more
professional and repeatable automated test environment, one that promotes
test-engineer retention and improves test-engineer morale.

Introducing Automated Testing to an Organization

New technology is often met with skepticism, and software test automation is
no exception. How test teams introduce an automated software test tool on a new
project is nearly as important as the selection of the most appropriate test
tool for the project. A tool is only as good as the process being used to
implement it.

Test teams have largely implemented automated test tools on projects without
having a process or strategy in place describing in detail the steps involved in
using the test tool productively. This approach commonly results in the
development of test scripts that are not reusable: a script works for a single
test run against one software build but cannot be applied to a subsequent
release of the application. Such scripts must be re-created for each
incremental software build and adjusted repeatedly to accommodate even minor
software changes. The result is an increased testing effort, schedule delays,
and cost overruns.
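The reusability problem above can be illustrated with a minimal sketch (a hypothetical login check, not an example from any particular tool): a hardcoded script bakes its test data into the logic and must be edited for every build, while a data-driven script separates the data from the logic so that new builds usually require only data changes.

```python
def validate_login(username, password):
    """Stand-in for the application under test (hypothetical)."""
    return bool(username) and len(password) >= 8

# Hardcoded script: test data is embedded in the logic, so any change
# to the application's rules means rewriting the script itself.
def test_login_hardcoded():
    assert validate_login("alice", "s3cretpass") is True
    assert validate_login("", "s3cretpass") is False

# Data-driven script: the same logic is driven by a table of cases.
# For a new software build, testers typically extend or adjust the
# table rather than re-create the script.
LOGIN_CASES = [
    ("alice", "s3cretpass", True),   # valid credentials
    ("", "s3cretpass", False),       # missing username
    ("bob", "short", False),         # password too short
]

def test_login_data_driven():
    for username, password, expected in LOGIN_CASES:
        assert validate_login(username, password) is expected

test_login_hardcoded()
test_login_data_driven()
```

This separation of test data from test logic is one common way teams make automated scripts survive across releases, rather than serving a single build.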

Perhaps the most dreaded consequence of an unstructured test program is the
need for extending the period of actual testing. Test efforts that drag out
unexpectedly tend to receive a significant amount of criticism and unwanted
management attention. Unplanned extensions to the test schedule may have several
undesirable consequences to the organization, including loss of product market
share or loss of customer or client confidence and satisfaction with the
product.

The test team also may attempt to implement a test tool too late in the
development lifecycle to adequately accommodate the test team's learning
curve for the test tool. The test team may find that the time lost while
learning to work with the test tool or ramping up on tool features and
capabilities has put the test effort behind schedule. In such situations, the
team may become frustrated with the use of the tool and even abandon it so as to
achieve short-term gains in test progress. The test team may be able to make up
some time and meet an initial test-execution date, but these gains are soon
forfeited during regression testing and subsequent performance of the test.

In the preceding scenarios, the test team may have had the best intentions in
mind, but unfortunately was simply unprepared to exercise the best course of
action. The test engineer didn't have the requisite experience with the
tool or hadn't defined a way of successfully introducing the test tool.
What happens in these cases? The test tool itself usually absorbs most of the
blame for the schedule slip or the poor test performance. In fact, the real
underlying cause for the test failure pertained to the absence of a defined test
process, or, where a test process was defined, failure to adhere to that
process.

The fallout from a bad experience with a test tool on a project can have a
ripple effect throughout an organization. The experience may tarnish the
reputation of the test group. Confidence in the tool by product and project
managers may have been shaken to the point where the test team may have
difficulty obtaining approval for use of a test tool on future efforts.
Likewise, when budget pressures materialize, planned expenditures for test tool
licenses and related tool support may be scratched.

By developing and following a strategy for rolling out an automated test
tool, the test team can avoid having to make major unplanned adjustments
throughout the test process. Such adjustments often prove nerve-wracking for the
entire test team. Likewise, projects that require test engineers to perform
hundreds of mundane tests manually may experience significant turnover of test
personnel.

It's worth the effort to invest adequate time in the analysis and
definition of a suitable test tool introduction process. This process is
essential to the long-term success of an automated test program. Test teams
need to view the introduction of an automated test tool into a new project as a
process, not an event. The test tool needs to complement the process, not the
reverse: a process has to be in place before a tool can be brought in.