Why Rotten Tomatoes scores don’t mean what they seem

August 29, 2019

Chances are you’ve seen this tomato before. It’s become ubiquitous, and quite contentious. This chart helps to explain why. We’re releasing more films now than ever before, and in a world of excess choice, people need guidance to make tough decisions. Which is why we need services like Rotten Tomatoes. The internet staple got its start in the late ’90s, and in 2016 Fandango bought its parent company. Now, you go to buy a ticket, and there it is. Which makes that rating important to understand, because the Tomatometer is more complex than you might expect.

Films can earn one of three designations: Rotten, for movies where fewer than 60 percent of critics gave a positive review; Fresh, for those where 60 percent or more did; or Certified Fresh. That badge is reserved for films that were reviewed at least 80 times and where 70 percent or more of the reviews are positive, at least five of them from top critics. Critics submit a review with their own rating, or sometimes Rotten Tomatoes asks the critic whether it’s positive. If it’s borderline, Rotten Tomatoes usually counts the review as fresh.

Rotten Tomatoes depends on a small army of reviewers to make the Tomatometer work. There are about three thousand critics counted right now, though not every critic reviews every film, so it’s usually a few hundred per film.

That’s Alissa Wilkinson; she’s a staff film critic at Vox.com, which means her reviews count toward the official Tomatometer. But the nuance in Alissa’s writing is largely reduced to the rating you’ll find near the top of her articles. Because Rotten Tomatoes uses a thumbs-up, thumbs-down method on everyone’s reviews, it ends up making vaguer statements of consensus. We don’t get much of a sense of people who have mixed ideas about a film.

Look at these two films: Ridley Scott’s Alien: Covenant and Barry Jenkins’ Moonlight. Both films are Certified Fresh, but the similarities end there. Alien was an underwhelming blockbuster sequel with a reported budget of 97 million dollars. Moonlight was an Oscar-winning drama from a fledgling director, with a budget 1/24th the size of Alien’s. Both films achieved the blanket consensus needed for the Certified Fresh badge. Alien finished with a Tomatometer of 70 percent, toward the low end of the Certified Fresh spectrum. Moonlight received a 98 percent Tomatometer: near-total consensus. But Alien was rated just 6.4 out of 10 on average, after Rotten Tomatoes converted critics’ star ratings, letter grades, and number scores to its 10-point scale. Moonlight, on the other hand, earned an average rating of 9 out of 10 per review. Most critics loved it and agreed with one another. So the two films earned the same badge but were qualitatively worlds apart.

This demonstrates the imperfection of the Rotten Tomatoes system, and the imperfection also appears when you compare two of 2017’s most critically acclaimed films. Here’s another scenario: both Christopher Nolan’s Dunkirk and Jordan Peele’s Get Out are highly rated and Certified Fresh. But according to the Tomatometer, Get Out edges out Dunkirk by 6 percentage points. If you saw these scores on the Fandango purchase page, you might think that critics rated Get Out higher than Dunkirk. Yet the Rotten Tomatoes page for each film shows that Dunkirk earned a higher average rating per review. Dunkirk earned a lower Tomatometer because there was less agreement among critics: more variance in the data. And when there is less consensus, the rating is lower. But a cute single tomato rating just can’t give you all that information.

Other rating systems try to circumvent these problems with their own methodology. Metacritic, the most visible aggregator aside from Rotten Tomatoes, is very subjective. It casts a much smaller net than Rotten Tomatoes, and it generally does more interpretation and weighting in its scoring. Metacritic is also less transparent about its rating system than Rotten Tomatoes.

So, is there a one-size-fits-all, killer method to get digestible and accurate reviews of a film in a fraction of the time it would logically take? Absolutely not. That’s preposterous; the whole point of the Tomatometer is to help you make a decision quickly. If you want context, you click and then you read. Or watch. And in a world of limited time and excess choice, we all benefit from a bit of guidance. Just make sure you know how your guide is getting you there.
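
The consensus-versus-quality gap described above can be sketched in a few lines of Python. Everything here is illustrative: the review scores are invented, and the 6-out-of-10 "fresh" cutoff is an assumption for the sake of the example, not Rotten Tomatoes' published rule. The point is just that a film with many lukewarm positives can post a higher Tomatometer than a divisive film whose average score is higher.

```python
from statistics import mean, pstdev

# Hypothetical review scores on a 10-point scale. These numbers are
# invented for illustration; a real film would have hundreds of reviews.
divisive_hit = [9, 9, 10, 8, 9, 3, 4, 9, 2, 10]      # loved by most, hated by some
mild_crowd_pleaser = [6, 6, 7, 6, 6, 6, 7, 6, 6, 5]  # "fine" across the board

def tomatometer(scores, fresh_cutoff=6):
    """Percent of reviews at or above the cutoff: consensus, not quality."""
    fresh = sum(1 for s in scores if s >= fresh_cutoff)
    return round(100 * fresh / len(scores))

for name, scores in [("divisive hit", divisive_hit),
                     ("mild crowd-pleaser", mild_crowd_pleaser)]:
    print(f"{name}: tomatometer {tomatometer(scores)}%, "
          f"average {mean(scores):.1f}/10, spread {pstdev(scores):.1f}")
```

Running this, the mild crowd-pleaser gets the higher Tomatometer (90% versus 70%) even though its average score is much lower (6.1 versus 7.3) and its reviews are far less spread out: the same shape as the Dunkirk and Get Out comparison, where more variance among critics pulled the Tomatometer down.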

Metacritic is "very subjective" and "generally does a lot more interpretation." Can you please elaborate? You kind of just put that out there without any evidence to back it up. After researching the histograms of the rating distributions from these four sites across a large sample of reviews for single movies, Metacritic's histogram most closely resembles a normal distribution (a bell curve). IMDb comes in a close second, Fandango seems to be heavily skewed to the right, and the Tomatometer looks like the opposite of a bell curve. https://medium.freecodecamp.org/whose-reviews-should-you-trust-imdb-rotten-tomatoes-metacritic-or-fandango-7d1010c6cf19

I think the Tomatometer underrates movies. I thought the live-action TMNT movies were good, and even though the Michael Bay Transformers movies didn't stick to the original, I thought they were awesome. Rotten Tomatoes is just there to tell people not to spend our money on good movies like Captain Marvel.

What is the point of this video, that the public's opinion of a film differs from that of critics? We've known this forever. Critics have a very different scoring standard and drastically different metrics with which to judge films. The public often judges a film by how much they enjoyed it and not much else, while critics take into account the production, the visuals, the acting, technical ability, and more. It's only natural that they aren't exactly the same.

I don’t really agree with a lot of the critics on things. Rotten Tomatoes seems worse to me because audience scores aren’t factored into the main score, only critic scores. I don’t think a higher rating should make your decision for you. A bad movie in one person's eyes might be a great movie in another's, and vice versa. Sure, I like to read the reviews, but only for the fun of it. I have thought about becoming a movie reviewer for fun, but I wouldn’t be a very serious one. You wouldn’t really be able to call me a critic.

I don't need Rotten Tomatoes as a guide to which movies I want to see. If a movie looks interesting based on its ad, then I'll see it. In other words, I'd see a movie based on my own judgment, not someone else's.

This is why I don't like Rotten Tomatoes that much: a lot of their reviews aren't accurate. I just downloaded Fandango on my phone. I'm a huge fan of Godzilla and extremely excited for Godzilla: King of the Monsters, and when I saw the Rotten Tomatoes score, instead of being fresh it was rotten. That made me so mad, because the reviews in the audience score were positive.

I love Rotten Tomatoes; it's just that they need to find a way to be more accurate.

Rotten Tomatoes is the worst. There is no distinction between very good movies, good movies, and merely above-average ones. Black Panther and Spider-Verse are a good example: both had 97%, but Spider-Verse has an 8.5 on IMDb while Black Panther has just a 7.3.

90% or more of the movies on Rotten Tomatoes have bad ratings. It's the worst thing to use before watching a movie. Just use IMDb or Google's "percent liked," because those are rated by regular people who actually watch the movies.

Excuse me! But you forgot the ranking where the critics had been snorting mold and ended up with a totally opposite result. From the looks of it, Rotten Tomatoes still hasn't learned its lesson about the aftereffects of mold addiction! 😒