Data Analysis of Movie Critics vs Movie Fans
The skills I demonstrated here can be learned through the Data Science with Machine Learning bootcamp at NYC Data Science Academy.
Objective
An analysis of how the top 100 movies on Metacritic and IMDb compare to one another, and what the differences reveal about the factors a fan weighs when rating a movie versus a critic.
Introduction
When you are looking for a movie to watch, I bet you do a little due diligence. I know this because if you're reading this blog post, you're interested or involved in Data Science or the NYCDSA, and that means you use data to make everyday decisions.
Many consumers use Yelp to find a new favorite dining spot or prowl through reviews to find the highest-quality product on Amazon, but you take it to the next level. You click on each reviewer. You try to understand them better: what else have they reviewed? Do they generally review things positively or negatively? If negatively, does that mean their standards are too high or, the opposite, that they have poor taste and judgment? If things seem too positive, you may wonder whether the individual is being paid to write favorable product reviews.
And if the reviewer consistently deviates from the consensus opinion on a subject, does that make them a hipster? If you're also a hipster, that may be a good thing, but I digress...
Methodology
IMDb
Both IMDb and Metacritic rely on a community of users and subject matter experts to evaluate movies and other entertainment, producing a reliable, low-bias metric for that same community. Just as your intuition as a Data Scientist tells you to look beyond a raw mean of ratings, IMDb and Metacritic give some ratings more weight and significance than others. Let's look at IMDb's rating methodology first.
"We take all the individual ratings cast by IMDb registered users and use them to calculate a single rating. We don't use the arithmetic mean (i.e. the sum of all votes divided by the number of votes), although we do display the mean and average votes on the votes breakdown page; instead, the rating displayed on a title's page is a weighted average.
IMDb publishes weighted vote averages rather than raw data averages. The simplest way to explain it is that although we accept and consider all votes received by users, not all votes have the same impact (or "weight") on the final rating.
When unusual voting activity is detected, a different weighting calculation may be applied in order to preserve the reliability of our system. To ensure our rating mechanism remains effective, we don't disclose the exact method used to generate the rating." [imdb.com]
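IMDb keeps its exact weighting secret, but the Bayesian formula it formerly published for its Top 250 chart gives a feel for how a weighted average differs from a raw mean. Below is a minimal sketch of that formula; the values of m (minimum votes) and C (global mean rating) are illustrative assumptions, not IMDb's actual parameters.

```python
# A minimal sketch of a Bayesian weighted rating, using the formula IMDb
# formerly published for its Top 250 chart. The m and C defaults below are
# illustrative assumptions, not IMDb's actual parameters.

def weighted_rating(R, v, m=25_000, C=7.0):
    """R: the movie's mean rating; v: its vote count;
    m: minimum votes to qualify (assumed); C: global mean rating (assumed)."""
    return (v / (v + m)) * R + (m / (v + m)) * C

# A heavily voted film keeps a rating close to its raw mean R,
# while a thinly voted film is pulled toward the global mean C.
print(weighted_rating(R=9.2, v=1_500_000))  # ~9.16, barely shrunk
print(weighted_rating(R=9.2, v=5_000))      # ~7.37, heavily shrunk
```

The shrinkage toward C is what protects the rating from small, enthusiastic voting blocs right after a release.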
Metacritic
Now we visit Metacritic's methodology.
"Metascore is a weighted average in that we assign more importance or weight, to some critics and publications than others, based on their quality and overall stature. In addition, for music and movies, we also normalize the resulting scores (akin to "grading on a curve" in college), which prevents scores from clumping together." [metacritic.com]
Now that we have established that the ratings we see on either website are more than a straightforward raw mean score, let's proceed to analyze the differences in the data between them.
Important considerations of the data:
- The data examined consists of the top 100 movies on IMDb and the top 100 movies on Metacritic
- IMDb uses a 0.0-10.0 rating scale, with approximately 1-2 million user votes per movie
- Metacritic uses a 0-100 rating scale, with approximately 10-30 movie-journalist reviews per movie
First, we look at the distribution of ratings for both:
IMDb Data
Metacritic Data
We notice immediately that there is a rightward skew, which intuitively makes sense: higher scores should be harder to obtain, especially if you picture these lists as the extreme right tail of a normal distribution over the entire collection of movies.
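As a quick way to quantify that skew, here is a minimal sketch assuming the two lists were scraped into CSV files with a rating column; the filenames and column name are hypothetical.

```python
# A minimal sketch of quantifying the skew, assuming the scraped lists live in
# CSV files with a "rating" column (filenames and column name are hypothetical).
import pandas as pd
from scipy.stats import skew

imdb = pd.read_csv("imdb_top100.csv")
meta = pd.read_csv("metacritic_top100.csv")

# Positive skewness means a long right tail: most films sit near the list's
# cutoff rating, and only a handful reach the very top scores.
print("IMDb skewness:      ", skew(imdb["rating"]))
print("Metacritic skewness:", skew(meta["rating"]))
```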
Data on Distribution of Release Dates
We move on to something more interesting: the distribution of release dates in each set of top 100 movies.
Analysis
As we can see, Metacritic favors older films more than IMDb does, with an earlier mean and median release year. We also observe two peaks in the Metacritic distribution, indicating a strong preference for movies released before 1980 and after 2000. Interestingly, although the 1980-2000 period finds limited favor with Metacritic, it is the period IMDb favors most, with 60% of its top 100 movies released in that window.
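A minimal sketch of this release-year comparison, reusing the hypothetical CSVs from the earlier sketch with an assumed "year" column:

```python
# A minimal sketch of the release-year comparison, assuming the hypothetical
# CSVs from the earlier sketch also contain a "year" column.
import pandas as pd

imdb = pd.read_csv("imdb_top100.csv")
meta = pd.read_csv("metacritic_top100.csv")

for name, df in [("IMDb", imdb), ("Metacritic", meta)]:
    years = df["year"]
    window = years.between(1980, 2000).mean() * 100   # share of list in 1980-2000
    print(f"{name}: mean year {years.mean():.0f}, "
          f"median year {years.median():.0f}, "
          f"1980-2000 share {window:.0f}%")
```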
Moving on, we wonder whether any movies overlap between the two lists. Initially, in the top 10 and top 25 sets, there is little overlap: only one movie, The Godfather (1972), is shared between them. However, as the sets expand to the top 50, top 75, and top 100, the overlap grows to 6, 9, and 16 movies, respectively, as the charts below and the sketch after them show.
Top 50
Top 75
Top 100
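A minimal sketch of the overlap check behind these charts, assuming each hypothetical CSV has a "title" column and rows sorted by rank:

```python
# A minimal sketch of the overlap check, assuming each hypothetical CSV has a
# "title" column and the rows are sorted by rank (best movie first).
import pandas as pd

imdb = pd.read_csv("imdb_top100.csv")
meta = pd.read_csv("metacritic_top100.csv")

for n in (10, 25, 50, 75, 100):
    shared = set(imdb["title"].head(n)) & set(meta["title"].head(n))
    print(f"Top {n}: {len(shared)} shared movie(s)")
```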
Finally, we examine directors with multiple films (two or more) in each top 100 list.
Data on Directors
Leading the charge on the IMDb list are Christopher Nolan and Stanley Kubrick, with a staggering 7 films in the top 100. The top 11 directors make up 40% of the IMDb list, while the top 11 directors on the Metacritic list account for only 16% of theirs.
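A minimal sketch of this director tally, assuming a "director" column in the same hypothetical CSVs:

```python
# A minimal sketch of the director tally, assuming a "director" column in the
# hypothetical CSVs used above.
import pandas as pd

imdb = pd.read_csv("imdb_top100.csv")
meta = pd.read_csv("metacritic_top100.csv")

for name, df in [("IMDb", imdb), ("Metacritic", meta)]:
    counts = df["director"].value_counts()
    repeat = counts[counts >= 2]                 # directors with 2+ films
    share = repeat.sum() / len(df) * 100         # portion of the list they cover
    print(f"{name}: {len(repeat)} repeat directors "
          f"cover {share:.0f}% of the list")
```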
We end this analysis by asking, what might explain the large differences in the two sets?
Obviously, critics and movie fans use different criteria to evaluate a movie. Critics apply more technical analysis of the directing, cinematography, editing, and storyline than the average movie fan. Fans may rely more on visceral measures: emotional responses, special and visual effects, and simply the gut feeling one has as the credits roll. Instead of conjecturing about the wide array of criteria used across the two groups, let's identify a theme both groups agree on, cultural impact, and the nuances of perspective within that theme.
IMDb evaluates cultural impact in real time, as users flood the database with scores following the release of a major blockbuster like Avengers, which at one point rose into the top 5 of the all-time list shortly after its release.
Metacritic evaluates cultural impact in hindsight, judging movies with respect to their relevance in their own time as well as their place in cinematic history. Movies that capture the zeitgeist, such as Moonlight, Lady Bird, and The Grapes of Wrath, are often featured on Metacritic. Moreover, movies that left a lasting impression on the industry by changing the way films were directed and produced are common. This is evident in the large number of highly reviewed movies released before 1960, and as far back as the 1930s, by directors like Alfred Hitchcock and Charlie Chaplin.