There's no question that knowledge of your domain is critical if you want to be successful in whatever job you are doing. I used to believe that I was capable enough that I could test anything. Yes, I suffered from hubris when I was younger, but indeed, I believed that all I'd have to do was apply some testing maxims and I'd be able to take my testing knowledge anywhere. In superficial cases, that's true. Domain analysis and equivalence class partitioning are the same everywhere, but how you use them, when you use them, and how they will actually tell you something relevant depends on the domain and your understanding of it. Put more bluntly, I don't care how great a game tester you are; if you want to apply at Genentech and work on their biomedical engineering initiatives, you had better know something about the world of pharmaceuticals, chemistry, and biomedical engineering.

That covers the domain, sort of, but what do we do about understanding our competitors? Ahhh, now we can talk about something fun and entertaining, at least, if you do it the right way.

Workshop #94: Round up some of the primary applications in your area of work and expertise: your own products and those of your competitors (demo or trial software is fine). Create a matrix of applications, then set up the criteria you would like to compare. Walk through all of the applications with an eye toward dispassionately seeing how they stack up against each other (i.e., an application "shoot-out"). Gather your findings and analyze them. Determine which applications come out on top in the various categories and write up executive summaries. Share these findings with your development/product team.
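One way to get started on that matrix is to treat it as data from the beginning. Here's a minimal sketch in Python; the application names, criteria, and scores are all hypothetical placeholders you'd swap for your own:

```python
# Sketch of a competitive-analysis matrix: rows are applications,
# columns are comparison criteria. Names and scores are hypothetical.
scores = {
    "OurApp":      {"install_time": 4, "import_speed": 3, "ui_polish": 5},
    "CompetitorA": {"install_time": 5, "import_speed": 4, "ui_polish": 3},
    "CompetitorB": {"install_time": 2, "import_speed": 5, "ui_polish": 4},
}

criteria = sorted(next(iter(scores.values())))

# For each criterion, determine which application comes out on top.
winners = {
    criterion: max(scores, key=lambda app: scores[app][criterion])
    for criterion in criteria
}

for criterion, app in winners.items():
    print(f"{criterion}: {app} comes out on top")
```

Even a toy structure like this forces you to name your criteria explicitly, which makes the "which app wins which category" write-up almost automatic.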

I'll be frank, Competitive Analysis is fun, if you approach it with the right mindset.

First and foremost, you have to make one simple commitment… leave your personal biases, passions, and loyalties at the door. If you want to do competitive analysis, you have to be prepared for the fact that your product may not perform well.

Second, you need to construct workflows that are representative of the ways that your company's and your competitors' products interact with their users. Get creative with this, and see how many of these workflows you can create. Work through ways that you can quantitatively and qualitatively represent the interactions.

For example, you can use benchmarking tools, or your automation tool of choice, to run each system through hundreds or thousands of looped tests. How did your app do? How did your competitors' apps do? Can you record the details in such a way that, when you declare the winners and losers, you can do it based solely on the numbers, and not on your knowledge of which product produced them? Note: this works best when you have a team to whom you have given random numbers, and those numbers correspond to platforms that you, as the evaluator, can't identify.
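That blinding step can be as simple as a coordinator handing each platform a random code and keeping the decoding key to themselves. A minimal sketch, with hypothetical platform names:

```python
# Sketch of a "blind" shoot-out: a coordinator assigns each platform a
# random code, so the evaluator records results against codes, not names.
import random

platforms = ["OurApp", "CompetitorA", "CompetitorB"]  # hypothetical names

def assign_codes(platforms, seed=None):
    """Map each platform to an anonymous code; only the coordinator keeps the key."""
    rng = random.Random(seed)
    codes = rng.sample(range(100, 1000), k=len(platforms))
    return dict(zip(platforms, codes))

key = assign_codes(platforms, seed=42)  # coordinator keeps this mapping private
blinded = sorted(key.values())          # the evaluator only ever sees the codes

# The evaluator fills in measurements against codes; afterwards the
# coordinator reveals which code belonged to which platform.
results = {code: None for code in blinded}
```

The point isn't the code itself; it's that the person declaring winners never handles the platform names until the numbers are already recorded.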

Quantitative reviews are fairly simple; if the specs and numbers are faster for one app vs. the other, it will show up in the numbers. Qualitative reviews are trickier. How do you rate user experience? What feels good vs. not so good? In qualitative reviews, language needs to be precise, and it needs to be consistent. Since you are comparing many different examples, you want to make sure that the level of your review, and the language you are using, is applied fairly. Collect the data from these activities in a database or, if you want to be old school, use a spreadsheet. See my earlier post about dashboards; competitive analysis is a fun place to play around with dashboards, because you are looking at lots of interesting data points that can be fiddled with.
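One way to keep that qualitative language consistent is to fix a rubric up front, mapping the same small set of terms to the same numbers for every app, and carry it alongside your quantitative measurements. A sketch, with hypothetical rubric terms and data standing in for a real spreadsheet export:

```python
# Sketch: a fixed rubric keeps qualitative language consistent across apps,
# and lets qualitative ratings sit next to quantitative measurements.
import csv
import io
import statistics

RUBRIC = {"poor": 1, "fair": 2, "good": 3, "excellent": 4}  # hypothetical scale

# Hypothetical data; in practice this would be a CSV exported from
# your spreadsheet or database of shoot-out results.
raw = """app,startup_seconds,ux_rating
OurApp,2.1,good
CompetitorA,1.4,excellent
CompetitorB,3.8,fair
"""

rows = list(csv.DictReader(io.StringIO(raw)))
for row in rows:
    row["ux_score"] = RUBRIC[row["ux_rating"]]        # same scale for every app
    row["startup_seconds"] = float(row["startup_seconds"])

mean_startup = statistics.mean(r["startup_seconds"] for r in rows)
print(f"average startup time across apps: {mean_startup:.2f}s")
```

Once the data is in this shape, feeding it to a dashboard (or just sorting the spreadsheet) is trivial, and "good" means the same thing in every row.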

Doing competitive analysis is a great way to do a little bit of sleuthing and play detective with competitors' products, but it has a marvelous knock-on effect. Looking at a whole range of products in a similar product space will quickly help you become a domain expert, not just on the products, but on the business range as well. Back when I worked at Connectix, I did a lot of testing on all of the available virtualization options at the time, and by the time I got through with doing that, I had learned a tremendous amount about how virtualization was being used and who was using it for what. The more examples and workflows you compare, and the more products you cover, the deeper the domain knowledge you develop. It's cool how that works :).

Bottom Line:

Learning your domain is important. Learning how your competitors handle your domain is equally important. Going through the process of studying your competitors and how they do what they do, with a critical eye toward what you can improve in the testing of your own products/applications, will give you an edge over testers who don't do this. Oh, and did I mention that it can be a lot of fun performing these product and workflow shoot-outs? Seriously, it is :).

About

I have, for the better part of my 26-year career, found myself in the role of being the "Army of One" or "The Lone Tester" more often than not. This unique viewpoint, along with ideas and advocacy regarding continuous education and learning/re-learning for testers, is frequently the grist for the mill that is TESTHEAD.

Some additional details for those interested:

I am currently a software tester and release manager for Socialtext in Palo Alto.

I am the producer of The Testing Show, a podcast sponsored by QualiTest and also featuring Matt Heusser, Justin Rohrman, and Perze Ababa, with different special guests and topics every two weeks. I was the producer of, and a regular commentator on, the SoftwareTestPro.com podcast "This Week in Software Testing" (hosted by Matt Heusser; the podcast ended its run in February of 2013).

I and a number of other software testers were actively involved in the development of the Software Testing materials for SummerQAmp, a nationwide initiative to train a new workforce in tech skills and create tech jobs for America's youth.