Survey Says Yelp ‘Best Quality,’ ‘Most Trustworthy’ Local Review Site

Late last night Yelp released the results of a Nielsen survey of approximately 1,000 “local review site users.” The survey results identify Yelp as the category leader according to a number of criteria such as usage frequency, quality, trust and influence.

I haven’t yet had a chance to look at the full Nielsen survey results. Yelp announced the findings on its blog.

It appears that the universe of sites compared in the survey was the following:

Angie’s List

Citysearch

OpenTable

TripAdvisor

Yelp

YP

Zagat

The findings Yelp released:

Most frequently used review site:

Yelp — 44%

TripAdvisor — 12%

Angie’s List — 8%

Citysearch — 6%

YP — 4%

Most influential when making final purchase decisions:

Yelp — 37%

TripAdvisor — 13%

Angie’s List — 8%

Citysearch — 4%

YP — 3%

Most trustworthy reviews:

Yelp — 25%

TripAdvisor — 17%

Angie’s List — 9%

Citysearch — 4%

YP — 3%

Best quality reviews:

Yelp — 24%

TripAdvisor — 20%

Angie’s List — 9%

Citysearch — 3%

YP — 2%

The survey also found that 85% of Yelp users report making a purchase within a week of using the site and a substantial minority do so within a day.

Some thoughts and caveats:

The methodology discussion says the survey was commissioned by Yelp. It also says the sample was weighted for age and gender to be representative of review site users. However (and the fine print is hard to read), it appears that the sample includes “668 self-reported Yelp users.” So there’s at least the possibility of some sample bias.

Google was not included in the survey choices. Google arguably does qualify as a review site, given its effort to become one over the past several years. In addition, OpenTable and Zagat are mentioned in the blog post but not in the main results released (above).

Accordingly, I’m asking for clarification from Yelp on the methodology and asking to see the full survey results.

What are your reactions to this survey? Do you agree? Are you surprised in any way?

Update: Mike Blumenthal pointed me to a survey he fielded using Google Consumer Surveys. The population was people who create local business reviews. The following data (n=1,002) responds to the question “When you leave a review online for a local business which sites are you likely to use?”

While this question is somewhat different from those asked above, the results do appear to at least partly contradict the Nielsen findings.

Update 2: Yelp provided some answers to the questions above. I’ve included those in my post at Search Engine Land about the study.

Yelp is certainly the category leader by some of these criteria, since its user base is large, highly engaged, and produces lots of rich reviews with high visibility across the Web. Yelp’s past insularity has loosened somewhat now that its data is being syndicated to Yahoo and Apple Maps.

But as you point out, Yelp’s biggest competitor among the masses (Google) is conveniently missing from this “study.” Moreover, the study reinforces the relative positions enjoyed by other review sites and the idea that the local review “ecosystem” still matters. Especially when you consider that consumers today typically check multiple review sources to triangulate on the truth when making a purchase decision.

Surprised this is even news. Of course Yelp is going to make itself look good in comparison to other smaller review sites. How could any self-respecting survey company that asks these questions leave out Google and Facebook, regardless of who commissioned it?

[…] that it did not include Google+ and Facebook. As local industry expert Greg Sterling mentioned in his review of this study, a survey (n=1,002) by Mike Blumenthal showed Google and Facebook as a more frequent response to […]

Greg: I did some independent research that will end up showing some very different statistics than what Mike’s survey showed. Soon to be published. But who knows on the very grand scale.

It is of course absurdly obvious that a business that pays for research will end up with results that highlight its value. Of interest are the questions asked and the non-answers, re Google and Facebook.

Businesses put out this hype. Some will believe it. Some will disregard, and …LOL, most won’t ever see it…including most in the “target audience” for this type of information.

On a qualitative basis, from the perspective of our SMBs of various types, we get feedback from customers that Yelp’s reviews are certainly read and valued. We haven’t bothered to evaluate which sites’ reviews are considered “best.” We get additional feedback that other sites are also valued and read by customers.

In this arena on your blog, with your readership, I’m sure the Yelp “news” will get a lot of skepticism… and it should. They commissioned the survey; they paid for it. Does anyone with that knowledge expect anything but a glowing report for Yelp???

But a few people will “buy into it.” Isn’t that the purpose of this type of bogus “newsworthy event”???

There’s no question Yelp is important. They selectively released data, which suggests there’s other stuff that may not be quite as flattering in there. Also the way questions are framed impacts the answers. But clearly Yelp is a brand with a lot of consumer influence.

Yelp commissioned the report? Should we consider that the same as paying for a review or asking for a review — both tactics that violate Yelp’s terms? Besides, the data is so skewed toward Yelp… Angie’s List, Yelp and TripAdvisor do not even cater to the same demographic or users… I think of Yelp when I want to go to a restaurant, I think of Angie’s List when I want to remodel my home and I think of TripAdvisor when I am planning a vacation. More people go to restaurants than remodel their homes or go on vacations….

This study is a bit of a bust IMHO. As David Mihm said – I’m surprised this is even news too.

It’s news because the organization behind the survey is Nielsen and it’s a comparison of local consumer behavior/attitudes. I don’t think the survey was necessarily designed to produce this outcome, but as I said, without seeing the questions and the full data set we can’t fully assess it. Yelp has only partly exposed the data. I suspect that all the data aren’t equally favorable to Yelp. Seeing the questions would be helpful, but Yelp declined to share the full survey with me.

Greg, I have to say this is terrible for Nielsen. This kind of opacity makes me distrust ALL of their data going forward. I guess at this point they are so desperate to protect any revenue stream, they’re willing to just be shills for the people who commission their surveys?

As far as what I’d characterize these results as, that would be “Yelp advertising” unless they’re actually willing to share the full results.

Since the data are “owned” by Yelp, it’s really Yelp’s decision whether or not to release them all. It’s not uncommon for commissioned studies like this to be released in pieces. But because of the inherent questions surrounding a study like this, Yelp should have released more data up front.