Subject: Ratings?

Sam Yhanto (Richardson, TX, United States)
What's the difference between a "Geek Rating" and an "Average Rating"?
 
Gary Sonnenberg (Waukesha, Wisconsin, United States)
See this explanation in the wiki: http://geekdo.com/wiki/page/ratings

Tim Benjamin (Los Alamos, New Mexico, United States)
The former is based on bad math, the latter is based on simple math.

Sam Yhanto (Richardson, TX, United States)
glsonn wrote:
See this explanation in the wiki: http://geekdo.com/wiki/page/ratings


So from what I can gather from this link, a game only gets a "Geek Rating" after it has been rated 30+ times, correct?

If so, then the question I seem to be left with is: what are "dummy ratings"?

RaffertyA wrote:
The former is based on bad math, the latter is based on simple math.


How do you mean?
 
Merric Blackman (Waubra, Victoria, Australia)
The Geek Rating is a Bayesian Average.

It adds in some "dummy" votes (at the average rating on the Geek for all games), so that games that haven't been voted on very much are pulled closer to that average, whereas games with a lot of votes (a large sample size) are barely affected.

Basically, if a game has only a few votes (and we're talking fairly large values of "a few" here), its rating really can't be trusted. It's not a good indicator of how it will eventually be ranked. So, the Bayesian average is used, which raises poorly-rated games and lowers well-rated games until they get enough votes.

To explain further - and to give a simple example:
Imagine one game has two votes: 10 and 9. It would have an average of 9.5.

Imagine another game has 1000 votes, with a final average value of 8.

If you ranked strictly on the raw average, the game with only two votes would rate higher than the 1000-vote game... but you'd have to say that rating is unreliable.

The Bayesian calculation adds in (say) 100 votes of (say) value 6.5 each. What you get then is the two-vote game ends up with an average value of 669/102 = 6.6, and the thousand-vote game ends up with an average value of 8650/1100 = 7.9.

As more votes are tallied, the ratings get more and more accurate.
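
To make the arithmetic concrete, here is a rough Python sketch of that calculation. The 100 dummy votes at a value of 6.5 are just the illustrative numbers from the example above, not BGG's actual (unpublished) parameters:

# Rough sketch of a dummy-vote ("Bayesian") average, using the
# illustrative numbers above: 100 dummy votes, each worth 6.5.
def dummy_vote_average(ratings, dummy_count=100, dummy_value=6.5):
    total = sum(ratings) + dummy_count * dummy_value
    return total / (len(ratings) + dummy_count)

two_vote_game = [10, 9]        # raw average 9.5
big_vote_game = [8] * 1000     # raw average 8.0

print(round(dummy_vote_average(two_vote_game), 1))   # 6.6 -- pulled toward the dummy value
print(round(dummy_vote_average(big_vote_game), 1))   # 7.9 -- barely moved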

Cheers,
Merric

Sam Yhanto (Richardson, TX, United States)
Ah, thank you.

V. M. G. (Brazil)
Let's see if I can explain and someone correct me if I'm wrong here:

Let's say I post a new game, ok? And all of my friends (30 of them, at least) come here and rate it 10. Now, is it fair for a homebrew game to have a mean rating of 10 while Agricola has an 8?

It isn't. And to stop a few votes from launching a game to the top of the ranking, the "Geek Rating" system works like this: each game's Geek Rating is computed as if it had already received a hundred or so ratings of 5.5. See the difference? Those are the dummy votes the FAQ mentions.

Assuming it's a simple mean, the formula is: the sum of all rating votes divided by the number of ratings. Hence, if a game has only one vote of 10, the raw mean is: 10 (the rating) divided by 1 (vote) = 10.
In my hypothetical scenario, 30 friends voted 10, so the game would have a score of: (30 times 10) divided by 30 = 10.

Now, if you include those 100 dummy votes, you'd have:
30 times 10 (my friends' votes)
plus
100 times 5.5 (the dummy votes)
divided by 130 (one hundred dummy votes + 30 friend votes)
=
((30*10) + (100*5.5)) / 130 ≈ 6.54

And that's how it works.
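
A quick Python sketch of that same calculation (again, the 100 dummy votes at 5.5 are just the illustrative values from this post, not BGG's real parameters):

friend_ratings = [10] * 30   # 30 friends all rating the homebrew game a 10

raw_average = sum(friend_ratings) / len(friend_ratings)
dummy_average = (sum(friend_ratings) + 100 * 5.5) / (len(friend_ratings) + 100)

print(raw_average)              # 10.0
print(round(dummy_average, 2))  # 6.54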

edit: oh, it got answered as I wrote this. No matter. =)

Tim Benjamin (Los Alamos, New Mexico, United States)
This is an incorrect application of the Bayesian method. The 'dummy' votes should be an expert's guess as to the final average of an individual game, NOT the average of some marginally related populations. And there are other aspects that are just improper/wrong.
 
Thomas Staudt (Rutesheim, Baden-Württemberg, Germany)
RaffertyA wrote:
This is an incorrect application of the Bayesian method. The 'dummy' votes should be an expert's guess as to the final average of an individual game, NOT the average of some marginally related populations. And there are other aspects that are just improper/wrong.


Then let's just _not_ call it Bayesian.

It can't be "improper/wrong" as it's the method _defined_ for the ratings.
It just is what it is.

You could argue that "there are better / more useful / less manipulation-prone ways" to make a ranking for board games, but that is another discussion.

Tim Benjamin (Los Alamos, New Mexico, United States)
I did not label it Bayesian; BGG did. So instead of improper/wrong I can substitute 'useless'.
 
James Webb Space Telescope in 2018! (Utah, United States)
I'm often annoyed at the Geek rating and the rankings.

For example, Runewars has a very high average rating from tons of people, but it's still ranked only 60th, because it hasn't yet been rated by thousands of people. How many ratings do you really need to know a game is good? 700 isn't enough?

However, I think BGG is using the Geek Rating and rankings to come up with some way of dealing with all the tons of games here. Preferring very popular (thousands of ratings) games that also have very high ratings seems like a reasonable solution that BGG has found.

I'm sure Aldie and the other admins have thought a lot about the ratings and they probably aren't going to change the system based on a few gripes in a thread like this.

 
Jeff Wolfe (Columbus, Ohio, United States)
tesuji wrote:
For example, Runewars has a very high average rating from tons of people, but it's still ranked only 60th, because it hasn't yet been rated by thousands of people. How many ratings do you really need to know a game is good? 700 isn't enough?

No, it really isn't. It represents only a small fraction of the BGG user community. Aldie and Company want ratings that represent the community at large, which that does not (yet).

Ratings here are from self-selected samples. That is, people decide for themselves what games to play and rate. The first players to rate a game are more likely to be people who are excited about the game from its description. Assuming it is done well and marketed correctly, those people will rate the game highly. As time goes on, those early adopters will introduce it to their friends and their friends' friends. At the same time, as buzz grows, people will begin to play it just to see what all the fuss is about. Those groups are more likely to rate it lower than the initial enthusiasts, because it's not as closely aligned with their tastes and preferences. Over time, the average rating will decline.

And the data supports that. If you ever look at the analysis people have done of popular games' ratings over time, most show the same trend: their rating shoots up quickly, then gradually declines over time. The same thing happens in the rankings, even with the adjustments BGG makes to try to reduce the effect.

DC (Grand Rapids, Michigan, United States)
Quick summary:

The ratings are 1-dimensional summaries of extremely complex, varied, and inconsistent human preferences.

Don't rely on them to tell you anything more than "It's a functional game" or "It's awful" -- maybe. Reviews, sessions, and user comments are much more useful when actually deciding on buying or trying a game.

Nate Downs (Granville, Ohio, United States)
I know, I know... late to the party...

So, why do crappy games get the benefit of this? Here are some of my crappy games (the Geek Rating is the first number, the average is the second):

Mad Libs Card Game 5.273 3.45
X-Men Trading Card 5.222 3.82

Here's the opposite:

Zooloretto Expn 2 5.759 7.34


As for reviews, they are only actually helpful if the reviewer manages to present things that they like that you have also experienced. A hip-hop fan is going to like the new RZA more than a fan of country-western, generally speaking, and that hip-hop fan would rate it higher.

An even better example is all the wargamers rating their games an 11 out of 10, when I wouldn't want to touch them with a 10-foot pole... If a wargamer reviews Agricola and says "this game is trash", but they don't tell me their favorite game took a year of chit cleaning before it was playable... I might not know to ignore the review! (I actually like all the drilling down on someone you can do here; it helps with that a lot, sometimes.)

A well written session report is better because it should be a little more objective. At least that is what I have seen.
 