In Part 1, I referred to analyzing user reviews of competitor SaaS products as "customer development for pre-launch startups, the chronically lazy, and the painfully shy."

In other words: it's the kind of customer research you can do when, for whatever reason, you can't actually talk to real customers.

It's not the ideal way to get to know your target market (the ideal way is actually asking them questions and listening to what they say, no matter how awkward you might feel doing it). Still, analyzing competitor reviews is a pretty awesome way to stress-test company and/or HiPPO-driven assumptions about what your future users will want and expect from your product, before you've got active users you can ask.

That being said, there are some important caveats to keep in mind when analyzing product reviews; otherwise you may end up gathering a whole lot of noise and not a lot of insight. Here's the system I follow to get information I can actually use:

Have a proper system for documenting & quantifying
You know how people say "pics or it didn't happen"? Same rule applies with customer research. Reading a bunch of reviews and then just mentally tallying up a few general takeaways to bring up at your next team meeting is NOT customer research. That's just wasting everyone's time.

Embrace forms & spreadsheets: yes, they're boring, but they're also shareable, team-editable, and free. To help me parse meaningful information from reviews and other sources of voice-of-customer (VOC) data, I use a dedicated Google Form + worksheet where I collect the whole raw quote from the user, then in subsequent questions pick apart the specific Loves, Hates, Needs, Pain Points, and Motivations that customers mention in said quote. FYI, doing data entry by Google Form (rather than straight into a spreadsheet) is a great way to keep qualitative data relatively clean, typo-free, and sortable down the road.

Then, once I've collected 100 or so excerpts, I can actually quantify and graph which Loves/Hates/Needs/etc. get brought up most often. I know, it's tedious, but it works.
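Once the worksheet is exported, the tallying itself is trivial. Here's a minimal sketch in Python; the row layout, category names, and tags below are invented for illustration, not the actual worksheet schema:

```python
from collections import Counter

# Hypothetical rows exported from the Google Form worksheet:
# each row is (raw_quote, category, tag). The categories and tags
# are assumptions for illustration.
rows = [
    ("No CSV export, so I re-key everything into Quickbooks.", "Pain Point", "manual data entry"),
    ("Love that reports run automatically overnight.", "Love", "automated reporting"),
    ("I need to hand numbers to my accountant monthly.", "Need", "accountant handoff"),
    ("Copying entries one by one takes forever.", "Pain Point", "manual data entry"),
]

def tally(rows, category):
    """Count how often each tag appears within one category."""
    return Counter(tag for _, cat, tag in rows if cat == category)

pain_points = tally(rows, "Pain Point")
print(pain_points.most_common())  # most-mentioned pain points first
```

The output of `most_common()` is already sorted by frequency, so it drops straight into a bar chart or a team update.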

Focus on needs & pain points, not features
This is perhaps the most important thing to keep in mind as you analyze product reviews. Needs and pain points help illustrate real-life use-cases and scenarios, which will help you shape your value proposition and core solution. Features don't.

Remember: what you're trying to get here are insights into the key pain points and problems your customer is trying to solve, NOT take a vote count on the line items of a feature spec list. So focus on lifting quotes in which customers explain the user needs & pain points driving their opinions.

"I hate feature X, it's so badly designed." <------- not a great quote. Tells you pretty much nothing.

"I hate feature X .... there's no way to export my data into CSV, so I have to copy and paste every individual entry into Quickbooks manually and it takes FOREVER!" <------ solid quote, helps paint a picture of the customer story that you can use.

Look (and even search) for "dealbreakers"
One of the coolest things about software review websites is that they attract prospects who are in the middle of choosing which product/brand to buy and are looking for advice. For relatively well-known competitor products, you'll often find comment threads within reviews where prospects state their key "dealbreaker" requirement and why, then ask existing customers whether or not product X meets that requirement.

This is huge ... you're not only getting a glimpse of the prospect's use case, you're getting a glimpse of their purchasing thought process! Pure gold.

(To make things even easier, you can do a CTRL-F for the words "dealbreaker" and "deal breaker" to zero in on these threads. That's how often people say things like "Does product X have feature Y? I have to be able to do Z, so if not, that's a dealbreaker for me.")
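If you've already scraped the review/comment text into a file, the same CTRL-F trick scales to thousands of comments with one regex that catches the common spelling variants. A quick sketch (the sample comments are invented):

```python
import re

# Matches "dealbreaker", "deal breaker", and "deal-breaker",
# case-insensitively.
DEALBREAKER = re.compile(r"deal\s*-?\s*breaker", re.IGNORECASE)

# Invented sample comments standing in for scraped review threads.
comments = [
    "Does product X have offline mode? If not, that's a dealbreaker for me.",
    "Great tool overall, support was quick to reply.",
    "No single sign-on is a deal breaker for our IT team.",
]

hits = [c for c in comments if DEALBREAKER.search(c)]
for h in hits:
    print(h)
```

Each hit is a prospect spelling out their purchase criteria in their own words, which is exactly the quote you want in your worksheet.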

Ignore angry rants
By and large, angry, frustrated people have way more to say on a subject than content, satisfied people, which can distort the feedback pool quite a bit. Moreover, people don't usually get rant-level angry about product design issues, they get rant-level angry about being ignored, neglected, and taken advantage of.

If you blindly accept angry feedback into your review analysis, you will likely end up with "responsive customer support" and "great refund policy" as your top-line customer needs every single time*. So do your best to filter out rage-y reviews and again, stay focused on use-case scenario-related needs & pain points.

Quick side note: While you should probably omit most angry rants from your MVP user research, it goes without saying that said angry rants should be reviewed by whoever will be managing the business-administration and customer-service side of things, to gather intel on what can go wrong post-purchase and how to avoid it.
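If you're triaging reviews in bulk, even a crude heuristic can pre-sort obvious rants into the customer-service pile. This is a naive sketch, not a real sentiment model; the thresholds and keyword list are assumptions you'd tune on your own data:

```python
# Naive rant detector: flags reviews that shout, over-punctuate,
# or use anger-themed language. Keywords and thresholds are
# illustrative guesses, not validated values.
ANGER_WORDS = {"scam", "ripoff", "worst", "never again", "ignored"}

def looks_like_rant(review: str) -> bool:
    text = review.lower()
    many_exclamations = review.count("!") >= 3
    shouting = sum(w.isupper() and len(w) > 2 for w in review.split()) >= 3
    angry_language = any(word in text for word in ANGER_WORDS)
    return many_exclamations or shouting or angry_language

reviews = [
    "WORST COMPANY EVER!!! They IGNORED my emails for WEEKS!!!",
    "Export to CSV is missing, so month-end reporting is slow.",
]

# Keep the calm, use-case-focused reviews for MVP research;
# route the flagged ones to the customer-service review pile.
research_pool = [r for r in reviews if not looks_like_rant(r)]
```

A human should still skim the flagged pile, since the heuristic will occasionally catch an enthusiastic reviewer who just likes exclamation marks.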

Validate with surveys/user testing/interviews as soon as possible
Remember, gathering a wackload of competitor reviews and quantifying which user needs & pain points are most common is a fallback tactic you can resort to when you don't have users you can ask directly. Tracking down users to talk to should still be your team's top priority.

After all, a systematic, constantly flowing customer feedback loop is the engine that ultimately powers every optimization effort: not just conversion optimization, but product design, marketing, and customer service. You might as well get it set up NOW.