Interestingly, even with a small sample of 30 responses, the results confirm what we’ve been seeing discussed here on this site and elsewhere about the problems in testing today. Manual Testing tops the list of testing challenges, with over 70 percent of respondents citing it as an issue; Lack of Test Coverage is second at 50 percent.

In a blog post on the results, the Grid-Tools team describes how Test Data Management and Test Case Design solutions can help overcome these challenges. We believe Service Virtualization can address them as well.

In today’s application economy, most of the data that development activities need lives in systems that are out of scope and may not be under your control. Rather than trying to extract data directly from those sources, you can use Service Virtualization to capture and simulate the behavior of out-of-scope systems, responding with just enough realistic data and dynamic behavior to “fool” your system under development into believing it is talking to the real thing. I think of this as TDM Light, which is especially valuable in the earlier stages of the development lifecycle. The blog post, by contrast, proposes more advanced TDM tooling, what I like to call Advanced TDM.
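To make the idea concrete, here is a minimal sketch of a virtual service: a stub that answers requests with canned responses so the system under development never touches the real out-of-scope dependency. The `/inventory/123` endpoint and its payload are hypothetical, purely for illustration, and a real Service Virtualization tool would record and replay such behavior rather than hard-code it.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Canned responses standing in for an out-of-scope inventory system.
# Endpoint and payload are hypothetical, for illustration only.
CANNED = {
    "/inventory/123": {"sku": "123", "inStock": True, "quantity": 7},
}

class VirtualService(BaseHTTPRequestHandler):
    """Responds with just enough data to 'fool' the system under test."""

    def do_GET(self):
        body = CANNED.get(self.path)
        if body is None:
            self.send_response(404)
            self.end_headers()
            return
        payload = json.dumps(body).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        pass  # keep the stub quiet

# Bind to an ephemeral port and serve in the background.
server = HTTPServer(("127.0.0.1", 0), VirtualService)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The system under development points here instead of the real system.
url = f"http://127.0.0.1:{server.server_port}/inventory/123"
response = json.loads(urlopen(url).read())
server.shutdown()
```

The system under test calls the stub exactly as it would the real service; only the endpoint address changes, which is what lets testing start before the real system is available.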

In a just-released Service Virtualization case study, Nordstrom, the fashion specialty retailer, outlines how it used Service Virtualization to cut performance testing time from three months to four days and reduce test data setup effort by 75 percent.

Do you think there is a point on the Test Data Management maturity scale where Service Virtualization can serve as an introductory TDM tool? Or should Test Data Management and Service Virtualization each be considered in their own right? Let us know in the comments and get the discussion started.