IMDb has a 10-star movie rating system, which in principle lets users express a whole range of appreciation. However, as far as I know, there is no explanation anywhere of what a score of ‘3’ or ‘9’ is supposed to mean. IMDb leaves it up to the user to decide how to rate a movie. Many people seem able to give only a 1 or a 10, which is why you'll often see peaks at those scores in the vote histograms. Most other people are still biased towards the lowest and especially the highest scores. I've seen reviews in which people summed up a whole list of bad points about a movie, and still gave it a 10. Eh?
In an attempt to arrive at a fair and balanced rating system, I created the guidelines below. Actually the only reason why I put this online is so I always have access to this ‘cheat sheet’ when I want to rate a movie. It is not my goal to impose this rating system on anyone, but if you've always felt unsure about what score to give a movie, it could be a useful guideline. Just don't assume that everyone rates movies according to these rules.
Since the lowest scores are probably the hardest to decide upon, here is some more clarification. Films that score 1 or 2 are the ones that would make you walk out of the cinema and demand a refund, or switch the channel on the TV (or, in retrospect, make you wish you had done that). A 3 is not quite bad enough to make you walk out of the cinema, but you'll still feel ripped off; the same goes for a 4, to a lesser degree. When watching a 3 or 4 movie on TV, you would keep watching only if you knew there was nothing better on. A 5 is the threshold for not feeling ripped off.
Of course, this scoring system is not strict. A movie that perfectly matches one of these descriptions could still get +1 added to its score if it has something that lifts it higher, like being the first movie to introduce a really original plot element, or bringing an important message in an unobtrusive way. Or it could have points subtracted if something in it truly offends you, even though the rest of the movie is really good.
You should, however, never completely pan a movie because of a single negative point. You should also not increase or decrease your score to balance out other reviewers; that's not how a voting scheme is supposed to work. And you should especially not subtract points because, for instance, a 1973 sci-fi film shows outdated predictions of the future. If you don't understand why, you should stick to watching recent films that fit within your small present-day universe.
Due to the lack of a clear definition of the ratings, the probable manipulation of the rating system by film studios, and the skewed way in which the average consumer casts their votes, it is pointless to judge a film by its IMDb rating alone. The voting histogram, accessible by clicking the user count underneath the stars, is a lot more helpful.
The first histogram shows how many votes were cast for each score. The largest peak and the median are generally better indicators than the average. The second histogram shows the average vote for various demographic groups. Everyone will have to learn which patterns in these histograms are, for them personally, signs that a movie might be great or best avoided. For instance, in my case, “Females under 18” and/or “Females Aged 45+” giving significantly higher scores than the other groups is often a sign to stay away.
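As a minimal illustration of why the median and the largest peak can tell you more than the average, here is a small Python sketch using an entirely made-up vote histogram; only the shape, with big peaks at 1 and 10, resembles a typical polarised film:

    from statistics import median

    # Hypothetical vote histogram: index 0 = score 1, index 9 = score 10.
    # The counts are invented; only the bimodal shape is realistic.
    votes_per_score = [4200, 300, 250, 300, 500, 900, 1800, 2600, 1900, 6100]

    # Expand the histogram into one entry per vote so we can compare statistics.
    all_votes = [score for score, count in enumerate(votes_per_score, start=1)
                 for _ in range(count)]

    mean_score = sum(all_votes) / len(all_votes)                  # about 6.7, pulled around by both extreme peaks
    median_score = median(all_votes)                              # 8, the score of the 'typical' voter
    peak_score = votes_per_score.index(max(votes_per_score)) + 1  # 10, the largest peak

    print(mean_score, median_score, peak_score)

Three very different numbers for the same film, which is exactly why the single averaged rating on its own tells you so little.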
The whole problem with the ‘useful’ meta-rating for reviews is that the system itself is not useful by any stretch. It is implemented in a naïve manner that encourages manipulation and generally causes the first few reviews ever posted for a film to appear as its most useful reviews forever. The system calculates a weight from the number of useful versus not-useful votes, and uses this weight to favour the most ‘useful’ reviews for the film's main page and for other ranking purposes. The usefulness estimate for a review will plummet instantly if the first few voters thumb it down. Unless the system only has a small number of reviews to choose from because only a few have been posted yet, the thumbed-down review will then have no chance to compete with other reviews for the main page, and will be stuck in a digital oubliette forever.
This means it is worth it for a studio to hire an army of drones who regularly skim through recently added reviews of their own precious productions and vote the positive ones up and the negative ones down. Of course IMDb prohibits this, but making it undetectable is peanuts, so I am certain it happens. You can try it yourself: do your utmost to write a review for a recent blockbuster that perfectly explains why you liked or disliked it. Wait a week and then look at how many ‘useful’ votes your review received. If your review praised the film, chances are it will have been voted near 100% useful. If it panned the film, your review will likely have been panned as well. If it is neither scathing nor full of praise, you might get something in between.
Of course, it is not unthinkable that the same drones are also encouraged to apply the inverse strategy to films from rival studios. Moreover, there are obviously also quite a few visitors who vote honestly. The situation will therefore never be as black-and-white as I picture it above. Still, if IMDb were to ignore and hide the ‘useful’ score until it reaches some statistical significance, and give reviews a fair chance until they are more certain to be crap, the review system as a whole would be a lot more helpful.
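IMDb does not publish how its usefulness weight is actually calculated, so the following is only a sketch of what such a ‘fair chance’ ranking could look like; the function name, the MIN_VOTES threshold and the use of a Wilson confidence interval are my own assumptions, not anything IMDb does:

    from math import sqrt

    MIN_VOTES = 20  # arbitrary threshold; below this the 'useful' score is hidden and ignored

    def usefulness_rank(useful, total, z=1.96):
        """Hypothetical ranking score for a review.

        Reviews with fewer than MIN_VOTES votes get a neutral 0.5, so a couple of
        early thumbs (up or down) cannot make or break them. Above the threshold,
        the lower bound of the 95% Wilson confidence interval is used instead of
        the raw useful/total ratio.
        """
        if total < MIN_VOTES:
            return 0.5
        p = useful / total
        centre = p + z * z / (2 * total)
        margin = z * sqrt(p * (1 - p) / total + z * z / (4 * total * total))
        return (centre - margin) / (1 + z * z / total)

    print(usefulness_rank(0, 3))     # 0.5: three thumbs-down prove nothing yet
    print(usefulness_rank(2, 2))     # 0.5: two thumbs-up prove nothing either
    print(usefulness_rank(90, 100))  # ~0.83: enough votes to be reasonably confident

Under a scheme like this, a studio drone would need far more fake votes to bury a fresh negative review, which is the whole point.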
The bottom line is: don't even bother reading the review on a film's main page; laugh in the face of the film studio drones and ignore it. Immediately click through to “See all user reviews” and use the ‘filter’ function. I really like the ‘Love/Hate’ setting, which pits the most positive reviews against the most negative ones. Also skim through the reviews with the filter set to ‘Chronological’: this will give you a better idea of how the reviews are distributed over time.