
Tool evaluation

Hi all,

The company I work for has decided (after much pressure) that there is a need for an automated test tool.
I have identified the requirements, and we are currently creating a POC (proof of concept) against which the vendors will demonstrate that their tool satisfies them.
We will obviously look at the market leaders and some of the smaller companies. The company recently hired a test tool automation analyst who previously worked with QTP.
The question I wish to ask is: what weighting should be given to training? I feel it is very important, even though the company will look to the expert we just hired to do most of the work. Also, is it better to do a formal (possibly certified, if such a thing exists) training course with either the vendor or an independent training company? I'm of the opinion that a formal course is the way to go, as the user will then have the documentation etc.

Re: Tool evaluation

The real answer to your question can only come from your organization. But I suspect that it will carry a goodly amount of weight given that you have only one analyst. He is going to have to be all things to all people: building scripts, building tests, training people, etc. He won't have time to do everything he will be asked to do.

A question for you to consider: since the analyst has previously worked with QTP, does that mean there is a built-in bias toward that product?

Re: Tool evaluation

The answer will lie with the company and will be based, I feel, on cost.
I will be assigned to help the expert, but from my experience here, I’ll be dragged off to work on other higher priority projects.
An increased weighting for training will hopefully allow me to support the expert as much as I can.
I checked the proof of concept, and it's not written that the tool should fit the expert's experience. The only reference is a "low weight" criterion for whether the scripting language matches any scripting language the test team has experience of. The primary aim of the tool will be to support the main applications we develop here, not to be the tool we find easiest to use.
The reason I want other views is that we have already purchased a performance test tool in the past. The training we received was in the form of consultation (not cheap). There were no formal documents, and the emphasis was on scripting rather than on analysing results or on how we should strategise. We then had to document everything ourselves. I definitely think that this was the wrong way to go.

Re: Tool evaluation

To answer your question about training: that is something we find is often overlooked when consultants are involved in the selection process, as they will want you to retain them; the bulk of the major vendors outsource their training to partner-consultants.

If a tool is difficult or complex to use, then more training will be required, and you'll also see a lot of questions on forums like this. (I would suspect that a number of the vendors you are selecting from will have questions about their products on here.)