Performance testing is something that I, as a developer, do when optimizing. However, if a change is purely an optimization, should QA or the business user perform their own performance testing? I would reckon they should, aside from verifying that the functionality remains the same. If that's the case, is it acceptable for them to test by manually timing the response time from the screen?

2 Answers

Any change you make should be the result of a requirement from the customer. This could be an initial requirement or an enhancement/bug report made when they find that the application can't handle 10,000,000 records (or whatever).

As such, any change should go through QA, with the pass criteria being a) that it works and b) that it works in the required timescale.
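Those two pass criteria can be automated rather than checked by a stopwatch. A minimal sketch in Python, where `run_report` and the five-second budget are hypothetical stand-ins for the real application and its performance requirement:

```python
import time

# Hypothetical system under test: in practice this would call the real application.
def run_report(num_records):
    return sum(range(num_records))  # stand-in workload

def check_pass_criteria(num_records, budget_seconds):
    """Pass criteria: a) it works, b) it works in the required timescale."""
    start = time.perf_counter()
    result = run_report(num_records)
    elapsed = time.perf_counter() - start
    works = result is not None               # a) functional check (simplified)
    fast_enough = elapsed <= budget_seconds  # b) performance check
    return works, fast_enough, elapsed

works, fast_enough, elapsed = check_pass_criteria(10_000_000, budget_seconds=5.0)
print(f"works={works}, fast_enough={fast_enough}, elapsed={elapsed:.2f}s")
```

Wiring a check like this into the QA suite means a performance regression fails the build the same way a functional bug does.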

If performance is a high priority, then I would expect the customer to do their own testing as well, either prior to initial acceptance or as soon as any enhancement/bug fix is deployed.

Now, my company generally works with large companies producing B2B software, so the client almost always does their own test pass after we deliver something to them.

What they test is up to them, but if there are performance requirements (or performance change requests), then we must do performance testing to know that we actually met those requirements. Often the client does as well (though I've seen them just run through a scenario and say "fast enough" - which it was).

So the straightforward answer is: QA always, and the client can decide if they want to as well.