Every year, a few movies (and TV shows) garner an unusually high volume of terrible ratings on IMDb. The first time I took note of such “review bombing” was in 2016 with the release of the female-led Ghostbusters (2016) reboot (source: https://screencrush.com/ghostbusters-imdb-what-the/). The ratings were so “unusual” that IMDb decided to evaluate the film’s rating with “an alternate weighting calculation” (source: https://www.imdb.com/title/tt1289401/ratings/?ref_=tt_ov_rt), and it now has a solid 6.9/10. More recently, Marvel’s She-Hulk: Attorney at Law has suffered from 36% of all its ratings being 1/10s. A back-and-forth of “this screams misogyny” and “well, why don’t Wonder Woman and Kill Bill also have low ratings then?” plagues my social media. I won’t voice my opinion on the matter right away, but I will say it’s interesting that Ghostbusters (2016) and She-Hulk are co-written by women, whereas Wonder Woman (2017) and Kill Bill (2003) are written by men. Though we could come up with women-penned movies appreciated by IMDb users (like Bridesmaids, a personal favourite), that does not mean the attacks on She-Hulk and Ghostbusters are not founded on misogyny. A lot of misogynists like women, as long as they fit in their box of what a woman should be and do. Maybe She-Hulk and Ghostbusters are outside that box. Or maybe they just caught the wrong people’s attention. Again, I won’t voice my opinion on the matter.
I will, however, make use of my amateur web scraping skills to look at all movies on IMDb from the past two decades (2000-2019) and see which ones receive the highest amounts of “hate”, and whether any noticeable patterns emerge. Levels of hate will be measured by the proportion of a movie’s IMDb ratings that are 1/10s. Of course, a movie with a high proportion of 1/10s could simply be a terrible movie (as the “what about Kill Bill and Wonder Woman” crowd would say), so we need some proxy for movie quality, independent of IMDb ratings, to differentiate the terrible movies from the not-terrible ones. I propose Metacritic’s Metascore.
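To make that measurement concrete, here is a minimal sketch of how the 1/10 share could be scraped in Python with requests and BeautifulSoup. The URL pattern matches IMDb’s ratings pages (as linked above), but the CSS selector is a placeholder I made up; IMDb’s markup changes often, so you would need to inspect the page and adapt it.

```python
import requests
from bs4 import BeautifulSoup

def pct_one_star(imdb_id: str) -> float:
    """Return the share of a title's IMDb votes that are 1/10.

    NOTE: the selector below is hypothetical; IMDb's markup changes
    often, so inspect the ratings page and adjust it before use.
    """
    url = f"https://www.imdb.com/title/{imdb_id}/ratings/"
    headers = {"User-Agent": "Mozilla/5.0"}  # IMDb rejects bare clients
    html = requests.get(url, headers=headers, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")

    # Hypothetical: one element per rating bucket, ordered 10 down to 1,
    # each containing that bucket's raw vote count.
    cells = soup.select(".rating-histogram .vote-count")  # placeholder
    votes = [int(c.get_text(strip=True).replace(",", "")) for c in cells]
    return votes[-1] / sum(votes)  # last bucket = the 1/10 votes

# Ghostbusters (2016), the title ID from the ratings URL cited above
print(f"{pct_one_star('tt1289401'):.1%} of ratings are 1/10s")
```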
I like Metascores for three reasons: 1) they are independent of IMDb ratings, since they are based purely on critic reviews; 2) they are posted within movies’ IMDb pages, which makes scraping the data easier for me; and 3) critics earn their livelihood reviewing movies as objectively as possible. That said, critics must have their own biases too, liking art house films more than the average moviegoer does, and blockbuster films less. This can be a slippery slope, since the whole premise of this analysis is that the validity of a movie-rating system should be questioned. Based on personal experience, however, I believe the magnitude and frequency of critics’ biases are smaller than those of IMDb users. Even if you disagree, the only assumption about Metascores I make for the analysis to be valid is that a movie well liked by critics is not among the worst movies I have seen (contrary to what a 1/10 rating on IMDb would suggest), and that a movie panned on Metacritic might be.
I believe this assumption to be true. Maybe you don’t. If it helps convince you, below is a chart of the Metascore ratings of the 2400+ movies released between 2000 and 2019 that received 50K+ IMDb ratings; please feel free to hover over movies that have high, medium, and low Metascores. Or, in the table below, look up the title of any movie in the Search bar to see its Metascore.
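As an aside, a hover-able chart like this is easy to reproduce; below is a minimal plotly express sketch, assuming the scraped data sits in a CSV with title and metascore columns (hypothetical file and column names, not my exact pipeline).

```python
import pandas as pd
import plotly.express as px

# Assumed: a CSV with one row per qualifying movie and
# "title" and "metascore" columns
df = pd.read_csv("imdb_movies.csv")

# One point per movie; hovering over a point reveals the title
fig = px.strip(
    df, x="metascore", hover_name="title",
    title="Metascores of 2000-2019 movies with 50K+ IMDb ratings",
)
fig.show()
```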
Note: in the future, I plan to investigate more thoroughly the differences between IMDb, Metascore, Rotten Tomatoes Critics, and Rotten Tomatoes Audience scores to better identify each platform’s biases.
As stated before, this analysis looks only at films released between 2000 and 2019. Movies with fewer than 50,000 ratings on IMDb are removed so that the films we investigate are more likely to be known by the average moviegoer. We also remove all movies not rated by Metascore. 2417 movies on IMDb fit these criteria. We break these movies down into three groups based on Metascore: 1. Metascore between 61-100, 2. Metascore between 40-60, and 3. Metascore between 1-39. These groups are based on Metacritic’s colour-coded rating system (https://www.metacritic.com/about-metascores). Movies in Group 1 received “Generally Favorable Reviews” or “Universal Acclaim” and are green in colour, movies in Group 2 received “Mixed or Average Reviews” and are yellow, and those in Group 3 received “Generally Unfavorable Reviews” or “Overwhelming Dislike” and are red. Hereafter, the three groups will be referred to as Favourable, Mixed, and Unfavourable, respectively, for brevity’s sake.
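In code, the filtering and grouping might look like the following pandas sketch; the column names (year, num_votes, metascore) are assumptions about how the scraped data is stored, not a definitive implementation.

```python
import pandas as pd

# Assumed schema for the scraped data: one row per movie with
# title, year, num_votes, metascore, and pct_1s columns
df = pd.read_csv("imdb_movies.csv")

# Keep 2000-2019 releases with 50,000+ IMDb ratings and a Metascore
df = df[df["year"].between(2000, 2019)]
df = df[df["num_votes"] >= 50_000]
df = df.dropna(subset=["metascore"])

# Metacritic's colour bands: 1-39 (red), 40-60 (yellow), 61-100 (green)
bins = [0, 39, 60, 100]
labels = ["Unfavourable", "Mixed", "Favourable"]
df["metascore_group"] = pd.cut(df["metascore"], bins=bins, labels=labels)
```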
The pie chart below breaks down the 2417 movies into the three groups: 44% (1062) are green, 40% (972) are yellow, and the remaining 16% (383) are red.
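Continuing the sketch above, this breakdown is a one-liner on the group column:

```python
counts = df["metascore_group"].value_counts()
print(counts)  # expected: Favourable 1062, Mixed 972, Unfavourable 383
print((counts / counts.sum()).round(2))  # ~0.44, 0.40, 0.16
```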
Finally, please find below the 10 movies with the highest proportion of 1/10 ratings on IMDb for each Metascore-stratified group.
In particular, out of all the not-terrible movies (Metascore >= 61), we find the ones with the greatest proportion of “hate” (% of 1/10 IMDb ratings).
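These per-group top-10 lists can be computed by extending the same pandas sketch, with pct_1s being the assumed column holding each movie’s share of 1/10 ratings:

```python
# Within each Metascore band, take the ten movies with the highest
# share of 1/10 ratings, highest first
top_hated = (
    df.sort_values("pct_1s", ascending=False)
      .groupby("metascore_group", observed=True)
      .head(10)
      .sort_values(["metascore_group", "pct_1s"], ascending=[True, False])
)
print(top_hated[["title", "metascore", "pct_1s"]])
```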
One final note on these extreme reviews: sometimes they come in before a movie is even released to the public, which clearly indicates some agenda founded on prejudice. And even the review bombers who have actually seen a film may have been prejudiced, making it harder to differentiate a genuinely awful movie from a fine movie whose haters are simply louder than its fans.