I've been having to deal with a lot of testing tools lately for testing tasks that should or must be automated. These include things like performance/load testing, web service testing, vulnerability scanning, etc.

After experimenting with both the open source and paid tools out there, I've found that, however well they may work, they do not lend themselves to modification to make the work of the individual tester or team easier, or to make more sense for them.

Does anyone have any experience building any of these tools in house? If so, was the payoff worth it? Many of us are willing to work in our off hours in exchange for making our work easier and less of a hassle. I can see this being useful in some specific circumstances, such as load testing and web service testing, since it would be easier to tie into internal resources.

11 Answers

Yes. Over the last few years I have built, re-built and evolved testing tools for a number of things:

A Windows automation library on top of UIAutomation

A full C# based testing stack built on top of Watin and the UIAutomation library

Control generators

Test case management systems

Defect tracking systems

Various integration tools to work with TFS, JIRA or Quality Center as required

Most of this has been open sourced (testingstax.codeplex.com). What we found is that whilst the overall technical expertise we required was higher, because we now control our own destiny we can really solve our own problems, as opposed to trying to adapt someone else's solution.

We also found that supporting ourselves actually worked out better: even after a vendor shipped a developer onsite to solve problems with their automation tool, they still couldn't solve the problems with their proprietary, closed source solution.

With the open source / in house combo, you actually have access to a much wider pool of talent, and if you have technical challenges or needs, a great motivator to solve them.

What NEVER worked for us though was trying to build a great tool to do X if there wasn't a pressing need to actually use it "in anger". Building something that would be cool, or a nice to have, never got finished.

The other key point that I like to say is that when it comes to this type of thing it is not about the tools, but about the people. If you don't have the people with skills to develop and maintain the tools, then don't even attempt to build them.

"What NEVER worked for us though was trying to build a great tool to do X if there wasn't a pressing need to actually use it "in anger"." Agreed! Also, it's important to make sure that the decision to build something in-house is a conscious decision, with all the support implications considered as well. Usually that decision wouldn't be left to an individual tester, but rather considered in the context of the overall QA Strategy.
–
Joe StrazzereMay 16 '11 at 15:51

Bruce, did you end up moving the website where your tools were opensourced? Looks like the link is no longer available.
–
Chris KenstSep 19 '12 at 23:09

Yeah, I took it down ... the code is still on Codeplex at testingstax.codeplex.com or on GitHub under teknologika.
–
Bruce McLeod♦Sep 21 '12 at 3:59

I have built quite a few in-house tools in different languages and on different platforms, and while I agree with everything that has been said before, one point I think is missing: you have to determine not only that you need the tool, but how you will support it. Either you schedule maintenance on the tool, or you have a toolsmith to work on it; without that, you get a tool that meets need X but never progresses and never gets updated. It also needs documentation, since the tool may be in use now, but in a few years someone new may need to work on it, and you want that future person to have an idea of what the tool was designed to do. Commenting the code helps.

Also, don't forget to test the tool itself. I've found a few instances where a tool worked great for X but caused issues with Y and failed utterly with Z; oftentimes when developing in-house tools, the testing is not as complete as it could be.
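To make the "test the tool" point concrete, here is a minimal sketch: a hypothetical helper from a home-grown test framework (the `normalize_url` function and its behavior are illustrative, not from any real framework) with a unit test written in the same style you would use for production code.

```python
# Minimal sketch: treat an in-house helper like production code and test it.
# "normalize_url" is a hypothetical utility from a home-grown test framework.
from urllib.parse import urlsplit, urlunsplit

def normalize_url(url: str) -> str:
    """Lower-case the host and strip a trailing slash from the path."""
    parts = urlsplit(url)
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme, parts.netloc.lower(), path, parts.query, ""))

def test_normalize_url_host_case():
    # Host is case-insensitive and gets lowered; the path's case is preserved.
    assert normalize_url("https://QA-Server.Example.com/Build/") == \
        "https://qa-server.example.com/Build"
```

Even a test this small catches the "works for X, fails for Y" regressions the moment someone extends the helper.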

I really like your point about maintainability. With many other QAs/testers in our group, I don't think many would be keen on continuing development of the tools after they are first built, and the tools would quickly fall by the wayside without an absolute need for them.
–
Lyndon VroomanMay 6 '11 at 0:17

Where I work, almost all of our tools are built in-house. Some have been extremely helpful, while others have been a big time sink.

However, IME, if you take the time to prioritize what you're going to build based on the biggest problems you want to solve first, then develop the tools with the same care you put into developing customer-facing apps - and develop iteratively so you know if you're heading in the right direction, AND communicate with product planning to make sure whatever you build won't be obsolete in the next version, you have a pretty good chance of getting positive ROI.

The team I work in uses a combination of open source software and home grown software, but a staggering majority of it is home grown.
We use available solutions for continuous integration and the running of automated tests, but the rest of the framework around this has been built from scratch; the needs of each of the technologies we are responsible for testing are too specific for any general tool to properly support. We rely on external solutions for very specific tasks they perform well for, then build the systems we need to be general within the scope of how our products work.
The payoff has been very high; the code base is interesting and the framework has been reliably serving us for years and can be extended to support new projects with fairly little difficulty.

Before you build anything, I would take a look around the open source world again. It doesn't matter whether it's Java, Python, Ruby or C#: there are tools. For example, you mention security scanning ... have you thought about running Nikto? For performance testing there's Siege or even Apache Bench.

When it comes to modification, I think OSS is the way to go depending on the licensing since the source is available for use.
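For a sense of scale, here is a minimal load-driver sketch using only the Python standard library. In practice you would start from Siege or Apache Bench as the answer suggests; this just shows how little code a purpose-built version needs, which is the trade-off the question is asking about. The URL and numbers are placeholders.

```python
# A minimal home-grown load driver, sketched with only the standard library.
import time
import concurrent.futures
import urllib.request

def fetch(url: str) -> float:
    """Fetch one URL and return the elapsed time in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start

def load_test(url: str, requests: int = 50, workers: int = 5) -> dict:
    """Fire `requests` GETs with `workers` threads; report simple percentiles."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        timings = sorted(pool.map(fetch, [url] * requests))
    return {
        "requests": len(timings),
        "median_s": timings[len(timings) // 2],
        "p95_s": timings[int(len(timings) * 0.95)],
    }
```

The appeal of in-house here is exactly what the question raises: this loop can call internal auth, seed internal test data, or log to internal dashboards with a few extra lines, which is awkward to bolt onto an off-the-shelf binary.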

We have had success adding utility methods to existing open source tools, which can then be used across the org. We use Selenium heavily for front-end automated testing. The utility methods we've built deal with externalization of test data, page objects, updates to the test management tool, etc. These util methods have taken a burden off the shoulders of testers who would otherwise have to write their own libraries.
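The page-object idea mentioned above can be sketched without depending on a running browser: the test talks to a page class, and the raw driver (a Selenium WebDriver in practice, anything with a `find_element` method here) is injected, so locators live in one place. The page name and locators below are illustrative, not from any real application.

```python
# Sketch of a page object: locators and page actions live in one class,
# and the driver is injected. With real Selenium, `driver` would be a
# WebDriver and the locator tuples would use selenium.webdriver.common.by.By.

class LoginPage:
    USERNAME = ("id", "username")
    PASSWORD = ("id", "password")
    SUBMIT = ("id", "login-button")

    def __init__(self, driver):
        self.driver = driver  # anything exposing find_element(by, value)

    def login(self, user: str, password: str) -> None:
        self.driver.find_element(*self.USERNAME).send_keys(user)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()
```

When a locator changes, only the page class is touched; every test that logs in keeps working, which is exactly the burden-sharing the answer describes.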

In the team I work in, we've seen a lot of value in developing in-house tools, for example:

A website front end to our QTP automated tests, so that QA team members don't really need to be QTP experts to run them.

A tool which can configure, and report on the configuration of a QA server.

A tool to validate build deployments.

In my opinion, if the task is repetitive and slow, the payoff is definitely worth it. It's worth trying to convince management to allow some team members to allocate at least one day a week to developing such tools, assuming, of course, the team members have the necessary coding skills.

Where I worked most recently, I built fairly specific tools and automation frameworks for the applications I was responsible for testing. I simply found it easier to do that given the applications I was testing. In this case my environment was data-driven C# WinForms and Windows services applications, and it was faster and easier for me to build an app- or task-specific tool for what I was doing. If I had been doing more web work, I might have looked more heavily into open source tools like Watin or Selenium and adapted them to my needs. It really depends a great deal on what you are trying to accomplish.

One thing I like about my own home grown tools is I am aware of exactly what they are for and what the strengths and limitations are. Over time you can amass a nice library of code that you can utilize across different projects to help you produce tools more quickly.

Yes, but I typically will try to find an open source tool that does most of the heavy lifting and add my customizations to it. This way, you are able to add value quickly without getting bogged down developing infrastructural stuff yet again.

Example: We needed to build a test framework to test an HL7 (healthcare) information appliance (take HL7 data in one form, map it to a different form and version). There were loads of rules and data, so it was an excellent candidate for automation. I found an open source tool called Mirth, http://www.mirthcorp.com/products/mirth-connect, which is rather popular in the healthcare world. It has all the HL7 goodies: validators, connectors to files, databases, web services, and scripting facilities; in short, all the infrastructure you would ever need.

We built our test framework on top of that and were adding value (finding defects) within a scant few days. Now, almost 3 years later, this solution is still very much alive, running all the regression tests (100K) on a very frequent basis, and it is now completely integrated into the tool chain.
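To show why HL7 mapping is such a good automation candidate, here is a toy parser, purely illustrative and nothing like what Mirth actually does internally: HL7 v2 messages are carriage-return-separated segments of pipe-delimited fields, so a test framework can get at any field positionally. The sample message is fabricated.

```python
# Illustrative only: HL7 v2 is segment-per-line, pipe-delimited, which is
# why comparing an input message to its mapped output is so automatable.
# Mirth provides real validators and connectors; this shows the data shape.

def parse_hl7(message: str) -> dict:
    """Split an HL7 v2 message into {segment_id: [list of field lists]}."""
    segments = {}
    for line in filter(None, message.strip().split("\r")):
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields[1:])
    return segments

SAMPLE = ("MSH|^~\\&|SENDER|FAC|RCVR|FAC|202401010000||ADT^A01|123|P|2.5\r"
          "PID|1||98765||DOE^JOHN")

parsed = parse_hl7(SAMPLE)
# parsed["PID"][0][4] is the patient name field, "DOE^JOHN"
```

A regression test then reduces to: map the input, parse both sides, and assert field-by-field that the mapping rules held, which is how a suite can grow to the 100K-test scale mentioned above.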

One mistake that I've seen companies make when they perform "build vs. buy" decisions is that they tend to underestimate the amount of effort it will take to maintain and continuously improve tools. If you're creating a simple quick tool that you and your team are confident won't grow in scope over time this "cautionary tale" might not apply to you, but this issue has bitten many companies in the butt (like Tangurena's, perhaps?).

Before getting too confident in the accuracy of a prediction that "we can build something better than tool XYZ in X hours that will fit our needs better," it is worth checking out tool XYZ's recent updates. Doing so will:

Give you some sense of how likely it is that the "missing features" you have identified in tool XYZ will be added to it any time soon.

Serve as a reality check on your initial calculations about how much effort will be required to create your own tool.

For what it is worth, to provide a concrete example, you can check out the summary of recent updates that we've made to our testing tool. I seriously doubt that even IT groups backed by Fortune 100 budgets would continue to make improvements, bug fixes, changes, etc. on such a frequent basis for testing tools that were built in house.

To be clear, I'm not trying to say buying a tool is always going to be the answer. Many times, getting an open source tool or building your own tool will clearly be a better option for you. I just wanted to highlight a risk I've seen; companies have a tendency towards underestimating the ongoing maintenance and improvement costs for in house tools (and perhaps overestimate the likelihood that in house resources 5 years in the future will have an interest in improving it).

PS - For anyone who is interested, we decided to open source the code to that "Recent Updates" feature. It's on GitHub.

The following is based on my experience testing and developing in-house tests for embedded systems; the answer for "pure" software systems might be different.
You have to distinguish the framework for executing tests from the "drivers" and the tests themselves, the latter always being in-house.
The framework to execute tests and collect results can be as simple as xUnit (Java or C++), another unit test framework (Perl has more than one), or an in-house tool. Often the benefits of developing one in-house are not major.
OTOH, the "drivers" layer (the common, usually low-level, part shared by all tests) greatly benefits from being in-house. For example, I evaluated a test suite based on JUnit that included such drivers for common test and measurement equipment; the drivers were very generic, and as such were inefficient and difficult to work with.