BoardGameGeek» Forums » Gaming Related » General Gaming

Subject: Top 100 Analysis June 13, 2017

JonMichael Rasmus (Madison, Wisconsin, United States)

The changing of seasons:
Prime Movers
NONE

Falling Stars
[15 games dropped 1]

Hot Lava Birth
NONE

Top Ten Trends
NO CHANGE

Top 5 Winning Movers
Arkham Horror: The Card Game
Orleans (second week!)
A Feast for Odin
Race for the Galaxy (second week!)
Five Tribes

New Highest Peaks
Arkham Horror: The Card Game #24
A Feast for Odin #40
Clank!: A Deck-Building Adventure #86

For a review/explanation of the terms in this thread, check out:
http://www.boardgamegeek.com/wiki/page/Top_100_Analysis

Strangely quiet in the upper echelons. Zombicide could be bounced out of the top 100 after only recently entering.

Have a good week.
JonMichael Rasmus (Madison, Wisconsin, United States)
Poll
Looking for some information about your thoughts about the top 100.
Which of the following best describes your opinion of the Top 100?
It is a useful tool for tracking the most popular games at a given time. (60.0%, 150 votes)
It is an accurate representation of the best games that have ever been produced. (3.2%, 8 votes)
It is a great tool for new BGG members to find great games to get them started. (28.4%, 71 votes)
It is a largely pointless exercise since there are many other tools that will help me find the games I want to play. (8.4%, 21 votes)
Voters: 250
If you had to compare the BGG top 100 to another listing, to which listed below would it most closely align?
The IMDb list of 250 top rated movies (43.3%, 103 votes)
The Billboard Hot 100 (21.0%, 50 votes)
The Rolling Stone list of the 100 best TV shows of all time (6.7%, 16 votes)
The People's Choice Awards (26.1%, 62 votes)
A Tumblr of great sports photography (2.9%, 7 votes)
Voters: 238
Other than average rating, what factors do you believe should be included in factoring the top 100 list?
The number of ratings (78.9%, 179 votes)
The years since release (31.3%, 71 votes)
The difficulty of the game (6.6%, 15 votes)
Awards and recognition (25.1%, 57 votes)
The hotness (# of clicks the game is receiving) (9.7%, 22 votes)
The number of reviews (9.7%, 22 votes)
The positivity or negativity of the reviews (Rotten Tomatoes) (33.0%, 75 votes)
The cost (2.2%, 5 votes)
The playing time (4.4%, 10 votes)
The game family (4.8%, 11 votes)
Voters: 227
This poll is now closed.   251 answers
Poll created by jmsr525
Closes: Tue Jun 20, 2017 6:00 am
Bryan Thunkd (Florence, MA, United States)
How is Five Tribes both a falling star and a top five winning mover?
Alexandre Santos (Brussels, Belgium)
How can Five Tribes both be a winning mover and a falling star?
Andi Hub (Frankfurt, Germany)
For the last question I would also have asked whether "who rated" should be taken into account. Basically, different users would have different weights, i.e. someone who has rated 1000 games would carry a different weight than someone with 10 ratings, or maybe even a weight based on how varied their ratings are (higher weight if you rate some games negatively and some positively, rather than giving every game a 9 or 10). However, I am not sure this would have a big impact, and I am also not sure I would like this form of discrimination.
Jürgen K (Dortmund, Germany)
ringo84 wrote:
For the last question I would also have asked whether "who rated" should be taken into account. Basically, different users would have different weights, i.e. someone who has rated 1000 games would carry a different weight than someone with 10 ratings, or maybe even a weight based on how varied their ratings are (higher weight if you rate some games negatively and some positively, rather than giving every game a 9 or 10). However, I am not sure this would have a big impact, and I am also not sure I would like this form of discrimination.

I think this is already implemented. When you look at the Top 100, you will find games with a lower average and fewer ratings ranked above games with higher numbers. GWT and MoM are a current example.
Joakim Schön (Alingsås, Sweden)
Thunkd wrote:
How is Five Tribes both a falling star and a top five winning mover?


Five Tribes is actually #48, not #49.
JonMichael Rasmus (Madison, Wisconsin, United States)
Thunkd wrote:
How is Five Tribes both a falling star and a top five winning mover?


Because I screwed it up.
JonMichael Rasmus (Madison, Wisconsin, United States)
Kudde wrote:
ringo84 wrote:
For the last question I would also have asked whether "who rated" should be taken into account. Basically, different users would have different weights, i.e. someone who has rated 1000 games would carry a different weight than someone with 10 ratings, or maybe even a weight based on how varied their ratings are (higher weight if you rate some games negatively and some positively, rather than giving every game a 9 or 10). However, I am not sure this would have a big impact, and I am also not sure I would like this form of discrimination.

I think this is already implemented. When you look at the Top 100, you will find games with a lower average and fewer ratings ranked above games with higher numbers. GWT and MoM are a current example.


Also, I should have included # of plays on the list. Live and learn.
JonMichael Rasmus (Madison, Wisconsin, United States)
I wonder if, with appropriate special sauce shillbusting, AVERAGE RATING + LN (# OF VOTES) would produce a better top 100?
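A rough sketch of what that scoring idea means in code (all numbers below are made up for illustration; this is not BGG's actual data or algorithm, and no shillbusting is attempted):

```python
import math

def log_boosted_score(avg_rating, num_votes):
    """Hypothetical ranking score: average rating plus the natural log
    of the vote count, so popularity helps but with sharply diminishing
    returns (doubling the votes adds only ln 2, about 0.69)."""
    return avg_rating + math.log(num_votes)

# Illustrative, made-up numbers: a widely rated 7.3 game can
# outrank a niche 8.4 game under this scheme.
broad = log_boosted_score(7.3, 70000)   # about 7.3 + 11.16
niche = log_boosted_score(8.4, 2000)    # about 8.4 + 7.60
print(broad > niche)
```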
Dianne N. (Seattle, Washington, United States)
jmsr525 wrote:
Also, I should have included # of plays on the list. Live and learn.


# of plays is tricky though. Shorter, lighter games will usually get more plays than something long and involved, though it doesn't mean the game is necessarily better or that people actually like it more. Maybe # of plays with average game time factored in? It's a tough one, but I agree that how much play a game actually sees should be factored in somehow.
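Purely as an illustration of "# of plays with average game time factored in", one could convert plays into hours on the table; this is a made-up metric, not anything BGG computes:

```python
def weighted_play_hours(num_plays, avg_minutes):
    """Hypothetical metric: total hours a game has been played,
    so play counts no longer favour short fillers. 30 plays of a
    30-minute filler equal 5 plays of a 3-hour epic."""
    return num_plays * avg_minutes / 60

filler = weighted_play_hours(30, 30)   # 15.0 hours
epic = weighted_play_hours(5, 180)     # 15.0 hours
print(filler == epic)
```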
Frank QB (Canada)
jmsr525 wrote:
I wonder if, with appropriate special sauce shillbusting, AVERAGE RATING + LN (# OF VOTES) would produce a better top 100?


I know you don't have to answer to our whims, but would we be able to see a mock top 10/20 with those parameters?

I suspect we'd see Settlers of Catan, Carcassonne, Ticket to Ride making big big gains.

If anything, it may swing too far the other direction.
Frank QB (Canada)
jmsr525 wrote:
I wonder if, with appropriate special sauce shillbusting, AVERAGE RATING + LN (# OF VOTES) would produce a better top 100?


For reference, I went and looked up the way the IMDB 250 is calculated. I'm not saying it's a good or bad system, just wanted a comparable.

From IMDB:

weighted rating (WR) = (v ÷ (v+m)) × R + (m ÷ (v+m)) × C

Where:

R = average for the movie (mean) = (Rating)
v = number of votes for the movie = (votes)
m = minimum votes required to be listed in the Top 250
C = the mean vote across the whole report


I wonder, then, what would happen if we just copied this with BGG comparables.
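The formula above translates directly to code. The m, C, and vote numbers below are hypothetical BGG-flavoured stand-ins, not real site values:

```python
def imdb_weighted_rating(R, v, m, C):
    """IMDb-style Bayesian weighted rating: with few votes (v much
    smaller than m) the score is pulled toward the global mean C;
    with many votes (v much larger than m) it converges to the
    item's own average R."""
    return (v / (v + m)) * R + (m / (v + m)) * C

# Hypothetical BGG-flavoured parameters: global mean C = 5.5,
# vote threshold m = 1000.
print(imdb_weighted_rating(R=8.5, v=200, m=1000, C=5.5))    # pulled toward 5.5
print(imdb_weighted_rating(R=8.5, v=50000, m=1000, C=5.5))  # close to 8.5
```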
Jay M. (Michigan, United States)
ringo84 wrote:
For the last question I would also have asked whether "who rated" should be taken into account. Basically, different users would have different weights, i.e. someone who has rated 1000 games would carry a different weight than someone with 10 ratings, or maybe even a weight based on how varied their ratings are (higher weight if you rate some games negatively and some positively, rather than giving every game a 9 or 10). However, I am not sure this would have a big impact, and I am also not sure I would like this form of discrimination.


Why would someone who has rated more games be weighted differently than someone who has rated fewer? Is fun not fun regardless of how many times you've decided to rate it?
Zachary Homrighaus (Clarendon Hills, Illinois, United States)
Jauron wrote:
ringo84 wrote:
For the last question I would also have asked whether "who rated" should be taken into account. Basically, different users would have different weights, i.e. someone who has rated 1000 games would carry a different weight than someone with 10 ratings, or maybe even a weight based on how varied their ratings are (higher weight if you rate some games negatively and some positively, rather than giving every game a 9 or 10). However, I am not sure this would have a big impact, and I am also not sure I would like this form of discrimination.


Why would someone who has rated more games be weighted differently than someone who has rated fewer? Is fun not fun regardless of how many times you've decided to rate it?


Seems pretty obvious that @ringo84 is trying to give more weight to people who are veteran raters. Perhaps the assumption is that they've been around, seen a few things and that their rating might take into account a broader set of experiences than someone who is rating the 3rd board game they've ever played.

Also obviously you could game that system by just rating a ton of games quickly in order to give your rating more weight. It would also tend to discourage people from starting rating games in the first place knowing their votes won't count for much.
Zachary Homrighaus (Clarendon Hills, Illinois, United States)
I'm sure this has been discussed in the past, but what about something like this?

The top 100 is generated by players completing an open-ended ranking of their top X games. There are no rules or restrictions on how you rank your games or on which criteria you base your ranking... you just fill in a list of top games from X to 1 and submit. The math aggregates all the ballots and produces a top 100.

The exact formula would still be a question, but I have to believe there is an existing standard for this sort of exercise (like how the MLB Hall of Fame voting works).

The benefit is that you don't have to worry about a rating from 1-10 and having different people mean different things for a 9. If one player rates all their games 9-10 and another adheres strictly to the BGG guidelines, there is no conflict, they each have to decide which game is simply the best and which is 2nd place and so on.

The con to this approach is that I suspect it still would favor popular games because lots of people would include them in their top X games... so a smaller, but excellent title will struggle to find footing.

Obviously this would require a whole data collection exercise, so I know it's not terribly feasible or practical, but wouldn't it produce a better list, absent many of the challenges the current system has?
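The Hall of Fame-style aggregation mentioned above could be approximated with a Borda count, one standard method for combining ranked ballots. This sketch allows ballots of different lengths, with a ballot of length k giving k points to 1st place; the point scheme is only one of several reasonable choices:

```python
from collections import defaultdict

def borda_top_list(ballots, top_n=100):
    """Aggregate ranked ballots with a Borda count: on a ballot of
    length k, 1st place earns k points, 2nd earns k-1, and so on.
    Ties are broken alphabetically for determinism."""
    scores = defaultdict(int)
    for ballot in ballots:
        k = len(ballot)
        for position, game in enumerate(ballot):
            scores[game] += k - position
    return sorted(scores, key=lambda g: (-scores[g], g))[:top_n]

# Three hypothetical ballots of varying length:
ballots = [
    ["Gloomhaven", "Pandemic", "Catan"],
    ["Pandemic", "Gloomhaven", "Catan"],
    ["Pandemic", "Catan"],
]
print(borda_top_list(ballots, top_n=3))
```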
Jay M. (Michigan, United States)
zjhomrighaus wrote:
Jauron wrote:
ringo84 wrote:
For the last question I would also have asked whether "who rated" should be taken into account. Basically, different users would have different weights, i.e. someone who has rated 1000 games would carry a different weight than someone with 10 ratings, or maybe even a weight based on how varied their ratings are (higher weight if you rate some games negatively and some positively, rather than giving every game a 9 or 10). However, I am not sure this would have a big impact, and I am also not sure I would like this form of discrimination.


Why would someone who has rated more games be weighted differently than someone who has rated fewer? Is fun not fun regardless of how many times you've decided to rate it?


Seems pretty obvious that @ringo84 is trying to give more weight to people who are veteran raters. Perhaps the assumption is that they've been around, seen a few things and that their rating might take into account a broader set of experiences than someone who is rating the 3rd board game they've ever played.

Also obviously you could game that system by just rating a ton of games quickly in order to give your rating more weight. It would also tend to discourage people from starting rating games in the first place knowing their votes won't count for much.


I don't see any value in that system, and agree doing so would just alienate participants while promoting an elitism stigma.

Board Gaming isn't a lifestyle, it's about having fun and everyone is qualified to identify fun.
Frank QB (Canada)
zjhomrighaus wrote:
I'm sure this has been discussed in the past, but what about something like this?

The top 100 is generated by players completing an open-ended ranking of their top X games. There are no rules or restrictions on how you rank your games or on which criteria you base your ranking... you just fill in a list of top games from X to 1 and submit. The math aggregates all the ballots and produces a top 100.

The exact formula would still be a question, but I have to believe there is an existing standard for this sort of exercise (like how the MLB Hall of Fame voting works).

The benefit is that you don't have to worry about a rating from 1-10 and having different people mean different things for a 9. If one player rates all their games 9-10 and another adheres strictly to the BGG guidelines, there is no conflict, they each have to decide which game is simply the best and which is 2nd place and so on.

The con to this approach is that I suspect it still would favor popular games because lots of people would include them in their top X games... so a smaller, but excellent title will struggle to find footing.

Obviously this would require a whole data collection exercise, so I know it's not terribly feasible or practical, but wouldn't it produce a better list, absent many of the challenges the current system has?


I like this idea. Even if everyone voted their top games all 10s, it'd work.
Matt Brown (Okemos, Michigan, United States)
frankqb wrote:
zjhomrighaus wrote:
I'm sure this has been discussed in the past, but what about something like this?

The top 100 is generated by players completing an open-ended ranking of their top X games. There are no rules or restrictions on how you rank your games or on which criteria you base your ranking... you just fill in a list of top games from X to 1 and submit. The math aggregates all the ballots and produces a top 100.

The exact formula would still be a question, but I have to believe there is an existing standard for this sort of exercise (like how the MLB Hall of Fame voting works).

The benefit is that you don't have to worry about a rating from 1-10 and having different people mean different things for a 9. If one player rates all their games 9-10 and another adheres strictly to the BGG guidelines, there is no conflict, they each have to decide which game is simply the best and which is 2nd place and so on.

The con to this approach is that I suspect it still would favor popular games because lots of people would include them in their top X games... so a smaller, but excellent title will struggle to find footing.

Obviously this would require a whole data collection exercise, so I know it's not terribly feasible or practical, but wouldn't it produce a better list, absent many of the challenges the current system has?


I like this idea. Even if everyone voted their top games all 10s, it'd work.


The one issue is finding what "x" needs to be. You could get someone here who would struggle to make a Top 50 and others who could crank out a Top 100+.
Andi Hub (Frankfurt, Germany)
Jauron wrote:
zjhomrighaus wrote:
Jauron wrote:
ringo84 wrote:
For the last question I would also have asked whether "who rated" should be taken into account. Basically, different users would have different weights, i.e. someone who has rated 1000 games would carry a different weight than someone with 10 ratings, or maybe even a weight based on how varied their ratings are (higher weight if you rate some games negatively and some positively, rather than giving every game a 9 or 10). However, I am not sure this would have a big impact, and I am also not sure I would like this form of discrimination.


Why would someone who has rated more games be weighted differently than someone who has rated fewer? Is fun not fun regardless of how many times you've decided to rate it?


Seems pretty obvious that @ringo84 is trying to give more weight to people who are veteran raters. Perhaps the assumption is that they've been around, seen a few things and that their rating might take into account a broader set of experiences than someone who is rating the 3rd board game they've ever played.

Also obviously you could game that system by just rating a ton of games quickly in order to give your rating more weight. It would also tend to discourage people from starting rating games in the first place knowing their votes won't count for much.


I don't see any value in that system, and agree doing so would just alienate participants while promoting an elitism stigma.

Board Gaming isn't a lifestyle, it's about having fun and everyone is qualified to identify fun.

@zjhomrighaus got it right: my intention is that veteran players get a higher weight. I do not think this would really alienate players if it were not communicated openly. As @Kudde mentioned above, something like this, or a different mechanism to detect shill votes, is already in place. So the current system differs from simply adding a certain number of dummy ratings of 5 (or 5.5?) to the actual ratings and then taking the average.

frankqb wrote:
jmsr525 wrote:
I wonder if, with appropriate special sauce shillbusting, AVERAGE RATING + LN (# OF VOTES) would produce a better top 100?


For reference, I went and looked up the way the IMDB 250 is calculated. I'm not saying it's a good or bad system, just wanted a comparable.

From IMDB:

weighted rating (WR) = (v ÷ (v+m)) × R + (m ÷ (v+m)) × C

Where:

R = average for the movie (mean) = (Rating)
v = number of votes for the movie = (votes)
m = minimum votes required to be listed in the Top 250
C = the mean vote across the whole report


I wonder, then, what would happen if we just copied this with BGG comparables.

Basically this is already in place via the dummy votes: the number m changes over time, and C is replaced by 5 (or 5.5).
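The dummy-vote smoothing mentioned here works like the following sketch; the dummy count and the 5.5 anchor are placeholders, since BGG does not publish its actual parameters:

```python
def dummy_vote_average(ratings, num_dummies, dummy_value=5.5):
    """Smooth an average by padding the real ratings with num_dummies
    phantom ratings of dummy_value, then taking a plain mean. A game
    with few ratings stays pinned near dummy_value; a game with many
    ratings approaches its true average."""
    total = sum(ratings) + num_dummies * dummy_value
    return total / (len(ratings) + num_dummies)

# Five real 9s against 100 dummy 5.5s barely move the score off 5.5:
print(dummy_vote_average([9] * 5, 100))
```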

I actually like the log-of-#votes approach. Of course, you have to decide on a factor for the log, which basically answers the question: "what absolute increase in rating should a game get over a game with only half as many votes?". If that value is 0.5 (instead of 0.693 without any factor), the top 3 games would be:
1. Agricola
2. Power Grid
3. Twilight Struggle

This certainly favours games with many votes, but it does not overwhelm the average rating at all.
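The scaled-log idea is a one-liner in code if the factor is expressed per doubling of the vote count; the 0.5-per-doubling case described above corresponds to boost_per_doubling=0.5 (the rating and vote numbers are made up):

```python
import math

def scaled_log_score(avg_rating, num_votes, boost_per_doubling=0.5):
    """Score = average rating plus a log term scaled so that doubling
    the vote count adds exactly boost_per_doubling (0.5 here, versus
    ln 2, about 0.693, for an unscaled natural log)."""
    return avg_rating + boost_per_doubling * math.log2(num_votes)

# Doubling the votes adds exactly 0.5 to the score:
a = scaled_log_score(7.8, 10000)
b = scaled_log_score(7.8, 20000)
print(round(b - a, 6))
```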
JonMichael Rasmus (Madison, Wisconsin, United States)
frankqb wrote:
jmsr525 wrote:
I wonder if, with appropriate special sauce shillbusting, AVERAGE RATING + LN (# OF VOTES) would produce a better top 100?


I know you don't have to answer to our whims, but would we be able to see a mock top 10/20 with those parameters?

I suspect we'd see Settlers of Catan, Carcassonne, Ticket to Ride making big big gains.

If anything, it may swing too far the other direction.


I quickly ran the top 500 through it:

1. Puerto Rico (2002)
2. Agricola (2007)
3. Pandemic (2008)
4. 7 Wonders (2010)
5. Dominion (2008)
6. Power Grid (2004)
7. Twilight Struggle (2005)
8. Carcassonne (2000)
9. Pandemic Legacy: Season 1 (2015)
10. Catan (1995)
11. Terra Mystica (2012)
12. Ticket to Ride (2004)
13. The Castles of Burgundy (2011)
14. Codenames (2015)
15. Race for the Galaxy (2007)
16. 7 Wonders Duel (2015)
17. Lords of Waterdeep (2012)
18. Ticket to Ride: Europe (2005)
19. Small World (2009)
20. Scythe (2016)
JonMichael Rasmus (Madison, Wisconsin, United States)
The current top ten, with each game's rank under the log formula:

Pandemic Legacy: Season 1 (2015) #9
Through the Ages: A New Story of Civilization (2015) #44
Twilight Struggle (2005) #7
Gloomhaven (2017) #63
Terra Mystica (2012) #11
Star Wars: Rebellion (2016) #48
Scythe (2016) #20
Terraforming Mars (2016) #50
7 Wonders Duel (2015) #16
Caverna: The Cave Farmers (2013) #24

I don't know. This seems like a better representation of the "hobby" than the current hotness-inflected top 100.

EDIT: M:TG at #55!
Jay M. (Michigan, United States)
ringo84 wrote:
Jauron wrote:
zjhomrighaus wrote:
Jauron wrote:
ringo84 wrote:
For the last question I would also have asked whether "who rated" should be taken into account. Basically, different users would have different weights, i.e. someone who has rated 1000 games would carry a different weight than someone with 10 ratings, or maybe even a weight based on how varied their ratings are (higher weight if you rate some games negatively and some positively, rather than giving every game a 9 or 10). However, I am not sure this would have a big impact, and I am also not sure I would like this form of discrimination.


Why would someone who has rated more games be weighted differently than someone who has rated fewer? Is fun not fun regardless of how many times you've decided to rate it?


Seems pretty obvious that @ringo84 is trying to give more weight to people who are veteran raters. Perhaps the assumption is that they've been around, seen a few things and that their rating might take into account a broader set of experiences than someone who is rating the 3rd board game they've ever played.

Also obviously you could game that system by just rating a ton of games quickly in order to give your rating more weight. It would also tend to discourage people from starting rating games in the first place knowing their votes won't count for much.


I don't see any value in that system, and agree doing so would just alienate participants while promoting an elitism stigma.

Board Gaming isn't a lifestyle, it's about having fun and everyone is qualified to identify fun.

@zjhomrighaus got it right: my intention is that veteran players get a higher weight. I do not think this would really alienate players if it were not communicated openly. As @Kudde mentioned above, something like this, or a different mechanism to detect shill votes, is already in place. So the current system differs from simply adding a certain number of dummy ratings of 5 (or 5.5?) to the actual ratings and then taking the average.



OK, why stop there? Why not just say only certain players can vote and be done with it? If the list is meant to represent the community, then all votes should count equally.

If the masses love Monopoly, vote it up, who really cares? During my research I'll realize it's not the game for me. What am I missing here that this list has to be carefully crafted like it represents me in any reasonable way?
Frank QB (Canada)
jmsr525 wrote:
frankqb wrote:
jmsr525 wrote:
I wonder if, with appropriate special sauce shillbusting, AVERAGE RATING + LN (# OF VOTES) would produce a better top 100?


I know you don't have to answer to our whims, but would we be able to see a mock top 10/20 with those parameters?

I suspect we'd see Settlers of Catan, Carcassonne, Ticket to Ride making big big gains.

If anything, it may swing too far the other direction.


I quickly ran the top 500 through it:

1. Puerto Rico (2002)
2. Agricola (2007)
3. Pandemic (2008)
4. 7 Wonders (2010)
5. Dominion (2008)
6. Power Grid (2004)
7. Twilight Struggle (2005)
8. Carcassonne (2000)
9. Pandemic Legacy: Season 1 (2015)
10. Catan (1995)
11. Terra Mystica (2012)
12. Ticket to Ride (2004)
13. The Castles of Burgundy (2011)
14. Codenames (2015)
15. Race for the Galaxy (2007)
16. 7 Wonders Duel (2015)
17. Lords of Waterdeep (2012)
18. Ticket to Ride: Europe (2005)
19. Small World (2009)
20. Scythe (2016)


Interesting. Seems less of a hotness list. Ticket to Ride and Catan definitely made their return as I thought they would while Pandemic Legacy also seems pretty well represented.

I wonder if this list is adaptable enough to highlight new games though. Would the vote threshold be both a blessing and a curse in terms of barrier to entry for new games?
Florian Woo (Stuttgart, Germany)
jmsr525 wrote:
I quickly ran the top 500 through it:

1. Puerto Rico (2002)
2. Agricola (2007)
3. Pandemic (2008)
4. 7 Wonders (2010)
5. Dominion (2008)
6. Power Grid (2004)
7. Twilight Struggle (2005)
8. Carcassonne (2000)
9. Pandemic Legacy: Season 1 (2015)
10. Catan (1995)
11. Terra Mystica (2012)
12. Ticket to Ride (2004)
13. The Castles of Burgundy (2011)
14. Codenames (2015)
15. Race for the Galaxy (2007)
16. 7 Wonders Duel (2015)
17. Lords of Waterdeep (2012)
18. Ticket to Ride: Europe (2005)
19. Small World (2009)
20. Scythe (2016)
This is highly interesting! Just to make sure I get the maths and am not lost in translation: you just add the logarithm of the number of votes to the average rating (not the geek rating)?