Online Reviews Are Biased. Here’s How to Fix Them

Executive Summary

Research shows that many of today’s most popular online review platforms — including Yelp business reviews and Amazon product reviews — have a distribution of opinion that is highly polarized, with many extreme positive and negative reviews and few moderate opinions. A new study shows that simple incentives can be used to produce more accurate online reviews. Correcting bias in online reviews can sway important decisions in the economy. Imagine a job seeker deciding between a similar job in two industries, such as consulting versus advertising. The research shows that the ranking of industries in terms of online ratings often flips based on whether reviews are left voluntarily or in response to an incentive.


In the age of the internet, reputations are almost never a blank slate. Consumers are surrounded by online reviews thanks to other consumers who’ve gone to the trouble of posting opinions about products and services online.

But online reviews are a double-edged sword. On the one hand, they’re a blessing if they help consumers make more informed decisions. On the other hand, there is a systematic problem with many online reviews — they tend to over-represent the most extreme views.

To see why, consider the last time you purchased a product. Perhaps you were asked to provide a review afterward. Did you do it? If so, our research suggests you most likely really loved the product, or absolutely hated it. If instead you had a moderate view, you’re likely to have left no review at all, finding it not worth the time and effort.

That problem generalizes to most online reviews. Research shows many of today’s most popular online review platforms — including Yelp business reviews*, and Amazon product reviews — have a distribution of opinion that is highly polarized, with many extreme positive and/or negative reviews, and few moderate opinions. This creates a “bi-modal” or “J-shaped” distribution of online product reviews that has been well-documented in the academic literature. This makes it hard to learn about true quality from online reviews.
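The selection mechanism behind this polarization can be illustrated with a toy simulation. The model below is our own simplified sketch, not the study’s methodology: it assumes true opinions are roughly bell-shaped on a 1–5 star scale and that the probability of posting rises with how extreme an opinion is. Even under these minimal assumptions, the posted reviews come out polarized while the underlying opinions remain moderate.

```python
import random

random.seed(0)

# Toy assumption: the chance of posting a review grows with the
# opinion's distance from the moderate midpoint of 3 stars.
def post_probability(stars):
    extremity = abs(stars - 3)      # 0 for a moderate view, 2 for an extreme one
    return 0.1 + 0.4 * extremity    # moderates post ~10% of the time, extremes ~90%

# True opinions: bell-shaped around 3 stars, clipped to the 1-5 scale.
true_opinions = [max(1, min(5, round(random.gauss(3, 1)))) for _ in range(100_000)]

# Posted reviews: each person posts with probability tied to extremity.
posted = [s for s in true_opinions if random.random() < post_probability(s)]

def histogram(sample):
    """Share of each star rating in the sample, rounded to 2 decimals."""
    return {s: round(sample.count(s) / len(sample), 2) for s in range(1, 6)}

print("true opinions:", histogram(true_opinions))  # peaked at 3 stars
print("posted reviews:", histogram(posted))        # hollowed-out middle, fat extremes
```

Running this shows 3-star opinions dominating the true distribution but shrinking to a small minority of posted reviews — the “silent majority” effect the research documents.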

One area where online reviews are particularly important to the economy is the high-stakes decision of choosing a job. One recent survey found 48% of job seekers in the U.S. today rely on online employer reviews from Glassdoor, the jobs site where two of us work, as part of their job search process — a huge fraction of America’s 160-million-person labor market. Relying on biased online reviews of employers could be a costly mistake — both for job seekers and employers.

How can online review platforms motivate this “silent majority” of middle-of-the-road voices to post reviews and collectively provide a more accurate picture?

In our new study, we tested whether simple incentives can be used to produce more accurate online reviews. The study combined two approaches: a controlled online experiment and real-world data from the online jobs site Glassdoor.

Offering Incentives in the Laboratory

In our experiment, we gathered a group of online participants and asked them to leave reviews of their employer. We then tested several types of incentives — both monetary incentives and “pro-social” incentives, in this case reminders that leaving a review would help other job seekers — to see how online company reviews changed with each. For the experiment, we used Amazon’s MTurk marketplace; all participants were paid $0.20 each to review their employer, and some were paid more, to test the effects of extra monetary incentives.

Our results show that people are more likely to leave online reviews when they’re reminded that doing so helps other job seekers. Simple pro-social incentives also led the distribution of reviews to be less biased, creating a more normal bell-curve distribution of reviews.

We also tested the impact of monetary incentives — paying participants extra to leave employer reviews. We found monetary incentives can also work, but only if they are high enough. As the size of monetary payments rises, so does people’s willingness to post online reviews — even those with moderate opinions who would otherwise remain silent. In our experiment, offering $0.15 extra — a 75% payment increase to participants — was enough to reduce bias in reviews.

Reviews in the Real World

Do our experimental results also hold up in the real world? To test that, our study also looked at an online incentive program on Glassdoor.

Glassdoor receives user content in two ways. First, users may voluntarily submit reviews of their employer, salary, and other job information. Second, Glassdoor also uses what’s known as a “give-to-get” policy that provides a strong incentive for users to provide content: After viewing any three pieces of content online, users are asked to submit their own review back into the online community before being able to view additional information.

We examined whether this real-world incentive policy changed online opinion about companies. Just as in our lab experiment, we found the distribution of online reviews left voluntarily differed markedly from those left by users who were given an incentive to leave reviews. The distribution of voluntary reviews was significantly more extreme — with many more positive and negative opinions of companies — than the more moderate distribution of incentivized reviews. Both in controlled experiments and in a real-world business setting, our research shows that providing monetary and pro-social incentives can lead to more balanced and representative online reviews.

Why Online Views Matter

Correcting bias in online reviews can sway important decisions in our economy. Imagine a job seeker deciding between a similar job in two industries, such as consulting versus advertising. Our research shows that the ranking of industries in terms of online ratings often flips based on whether reviews are left voluntarily or in response to an incentive. For example, job seekers relying only on polarized voluntary reviews may believe consulting is a less desirable industry than advertising, when a more balanced set of incentivized reviews offers the opposite conclusion. In this way, biases in the distribution of online reviews can affect real-world economic decisions by distorting the information consumers, job seekers, and investors rely on.

Online reviews are a powerful tool for sharing information at scale. But it’s important to remember the source — many online reviews today are from those who’ve voluntarily decided to share opinions, giving a distorted view of products, services and companies.

Our research offers hope for making online opinion-based reviews more socially useful. Using relatively inexpensive incentives, our research shows online platforms can dramatically reduce bias and encourage more moderate voices to join the online conversation. Online reviews are a fundamentally social endeavor. Simply reminding individuals of the pro-social benefits they provide by sharing opinions online can have a powerful impact on their willingness to leave online reviews and improve the usefulness of today’s ubiquitous online review platforms.

*This article was updated to change the link mentioning Yelp’s review distribution. Although Yelp itself reports that nearly half of reviews written across all categories are 5 stars, the research this article originally linked to shows that the distribution of reviews displayed to customers after Yelp removes fraudulent and low quality reviews is significantly less extreme. Yelp employs various non-financial incentives for their reviews.

Nadav Klein is a post-doctoral scholar at the University of Chicago Harris School of Public Policy.

Ioana Marinescu is an assistant professor of economics at the University of Pennsylvania School of Social Policy & Practice.