Thanks Very Much for Your Opinion, Unfortunately It’s Useless to Me

As a product manager at a technology company, people come to me all the time with suggestions for things we should do. “We should build X new feature!” “We should change Y process!” “We should pay more attention to Z customer group!” Sometimes it feels like everyone has an opinion about how we can improve our bottom line.

The problem is, often those opinions are presented with little data or factual support.

Over time, I’ve noticed a few common types of these “opinion-based” requests.

1) The “over-index on a single use case or incident” request

This kind of request is extremely common. It’s usually triggered when something bad happens – we lose a customer, we lose a new business pitch, one of our customers loses money, etc. Any time something bad happens, it’s tempting to over-index on the specific case that caused the loss. It’s human nature to overreact and over-adjust – but if you’re not careful, you could put more focus on the incident than it deserves.

2) The “This is what our competitors are doing, therefore we should do it too” request

Another common fallacy: our competitors are doing XYZ – and they’re “crushing us” – therefore we should do what they’re doing too.

The problem with this request is that people almost always overestimate the market success of competitors. It’s a version of the “grass is always greener” complex. Competitors have the same problems with strategy, organization, and execution that your company does. In all likelihood, the same competitors you’re looking at are looking back at your company and talking about how they can copy your practices.

3) The “We suck at this”/“We always mess this up” – therefore we should invest more in it

This request is trickier. It’s the desire to do an activity well (e.g. user testing or events) that the company doesn’t typically do well. The problem with this type of request is that there is probably a reason (either obvious or hidden) why your company is bad at that activity. It’s not always possible to fix systemic problems overnight – and sometimes focusing on a problem at the wrong level (e.g. trying to fix a strategy problem at a tactical level) can be counterproductive.

There are more…

Whenever folks come to me with these kinds of suggestions, I always want to say the same thing: “Thank you very much for your opinion, unfortunately it’s useless to me.”

Of course, this response is much more flippant than I would ever present in person, but to a degree it’s the truth.

Opinions often are useless; what I really need is data.

If data is not immediately apparent (as is often the case), it’s important to ask the question, “What data would we need to make a better decision about this?”

Every single decision can be made correctly if given the right data.

Sometimes data will be missing and assumptions need to be made. Assumptions are dangerous, and you need to keep a close eye on them. It’s important to list out and “prove” or “disprove” assumptions as quickly as possible (or risk continuing on the wrong course).

Still – assumptions should not be based on opinions either. There is data and there are data-supported assumptions – that’s it.

Saying, “This is what I think we should do” is easy. The hard work is collecting the data needed to make the right decision, analyzing it, and defining and refining use cases. This kind of work can be extremely tedious – often very time-consuming and not very fun. But if done correctly, you will make better decisions and have a better strategy.

When I was a strategy consultant, our job was telling C-level people “you should do X.” Our typical PPT deck would be an exec summary page with six or so bullets, and then 30 pages of data and analyses backing up our assertion. 95% of my job was building the analyses to prove our hypotheses (or leading a team to do so).

Love the blog post, and I have a suggestion for a follow-up: you should write about how to support your argument with data, particularly when the data to get is not super obvious.

Thanks Dr. Truxal; great point re: supporting an argument with sparse data. Maybe you can teach me how strategy consultants do this? Would love to learn.

Alex Cone

Perhaps just as crucial as the data is starting with a problem as opposed to a solution. I always ask people who bring me product requests, “What problem are you solving, and for whom?” I look at feature requests as a signal that there may be an underlying problem. Whether or not we should solve it is another question altogether. That’s where data and company strategy come into play.

Another classic PM problem. Last week I was talking to a technical solutions consultant who recently made the transition into product management. He said that when he used to work in services, he thought product management was just making a long list of features to prioritize. Now he realizes that if you do that, you’ll never finish the list and never really get anything done.

Eric Picard

The wrong takeaway from this article is to push away people who come to you with symptoms of a problem. Usually the example types above are really just symptoms of an underlying problem.
Symptoms are useful, but only as far as they help address the root cause. I think of product work in many ways as carving away the symptoms that point the way to the root cause of the issue.

There is still the axiom of the customer always being right (even when they’re wrong). They’re generally wrong when they talk about the problem they have with the software through the lens of a suggested solution. I always listen (occasionally there are good ideas here), but I often ask many questions trying to uncover the root issue.

Also – expecting customers (internal or external) to show up with data is a bit unrealistic. All they can tell you is their experience. It’s product’s job to dig into the data – and to weed through the edge-case requests to understand the core issues.

Thanks for the note here, Eric – funny, about 10 minutes before you posted this, John Shomaker posted a very similar message on the LinkedIn update where I originally linked to this post.

Reposting my response from the LinkedIn post, because I feel it’s relevant here as well. But I will say that both you and John make a good point, and it would be a huge mistake for any PM to universally drive feedback away.

“One of the reasons I wrote the post was to give non-product folks inside technology organizations an idea of what the PM does with their feedback (basically, the PM has to sift data out of opinions). My thought was that if more people recognized what the PM was trying to do, it could compel stakeholders to do a better job of supporting their anecdotes with facts – sort of meeting the PM halfway. Ultimately it depends on the complexity of the product, the level of staffing of the PM org, the maturity of the product, etc. – but I have observed cases in highly complex product environments where the sheer volume of non-data-supported anecdotes can overwhelm the PM org and have a negative impact on the long-term roadmap. Of course, in environments where the factors are different (less complex product, more PM staffing, etc.), the balance here can look different (as you point out).”

In some ways this is the problem with writing about product management – it’s not really a one-size-fits-all skill set or process, but depends heavily on company-level and industry-level variables. That being said, this is one of my favorite things to debate – so thanks very much for reading and commenting!

Hi Andrew, as some commenters have already observed, requesting data right away might come off poorly and lead to waste.

A modification to consider might be asking questions first to drill down to the real issue. Once you’ve understood the problem at the root of the request, you can connect it back to the most important business challenges you’re trying to solve, or the initiatives you’re trying to push. In this way you help the requester understand the problem and its relative importance to achieving your overall strategy.

Once you’re pointed at the right problem, data can help you validate and quantify.