Subject: Criteria for Critically Evaluating (critiquing) YOUR Table-top game

marc lecours
Ottawa, Ontario, Canada
______Enjoy playing the game.
______Don't enjoy playing the game.
"What do you mean, I can't pay in Meeples?"
Canada
A lot of those choices are either ambiguous or "symptomatic" rather than defining an actual problem. What does a reaction of "appropriate" mean? Good for the player's skill level, age, or interests? Politically correct? Okay for kids? I don't know, and they probably wouldn't either.

For example, if your play-tester found the components "clumsy", does that mean they caused problems with partial colour blindness? Were the tokens too hard to pick up? Did the cards stick together? It could be any of those (or something else entirely), and they're all very different issues.
 
Paul DeStefano
Long Island, New York, United States
_____ Fun
_____ Should be fun because the numbers say so, but isn't
marc lecours
Ottawa, Ontario, Canada
You left a lot of blank spaces for more items. This is good: there is so much variety in games that you need flexibility.

Your next step in making a good critique checklist is to test it on a few hundred games. That is where you will see which criteria are chosen often, which are never needed, and which should be added to the checklist.

Each such checklist will inevitably be geared toward the criteria that the maker of the list finds important. In other words, every person should have their own personal checklist.

A more common option for evaluating games is simply to write about the things (positive and negative) that stand out about the game.

One advantage of a checklist (such as yours) is that it evaluates games in a more systematic way. Another advantage is that it allows the compiling of statistics about games (and further research about them); a sketch of that kind of tally follows below.

One disadvantage is that it gives a very inadequate evaluation of games that are very different from the majority of games.
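As an illustration only (the criteria and counts here are invented, not drawn from any real checklist), compiling those statistics could be as simple as tallying which boxes get ticked across playtests:

from collections import Counter

# Hypothetical data: each response is the set of criteria a tester ticked.
responses = [
    {"fun", "components clumsy"},
    {"fun", "too long"},
    {"too long"},
    {"fun"},
]

tally = Counter(criterion for response in responses for criterion in response)
for criterion, count in tally.most_common():
    print(f"{criterion}: ticked in {count} of {len(responses)} playtests")

Once criteria are tallied this way, the ones that are never ticked become obvious candidates for removal from the checklist.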
 
"What do you mean, I can't pay in Meeples?"
Canada
I don't have many suggestions, because I don't feel a checklist for game feedback is useful enough to be worth doing. The best feedback option is the one that leaves things open to any type of communication. If your play-testers want to submit a video of the game, let them. If they want to write a stream-of-consciousness account of themselves playing, read it. If they want to write you notes, take them. If they want to discuss things after playing, talk to them. If you have to go with a set of questions, I'd make them much more open-ended and not provide pre-set responses. "Would you like A or B?" usually gets answers of "A" or "B", but never "C", even if "C" is what most respondents would have picked had it been on the list.

I suspect that if you hand out a generic checkbox questionnaire, most people will simply tick whatever they find most appropriate and skip giving direct feedback, if they aren't put off completely by trying to figure out what the questions are asking (as I was). So, in short: I'd throw this out and try again with a few open questions like "Would you play this game again? What parts were the most fun, and the least fun? Did you run into any problems while playing?" Tailor the questions to whatever you need feedback on most.
Cody
Placentia, United States

Quote:
Implication

Are you implying something?

Quote:
Expressive Factor

Uhhh...



"What do you mean, I can't pay in Meeples?"
Canada
BooneDoggle76erz wrote:
Scythe had over 1300 playtests completed. It would be awfully difficult to remember what that many people had to say, let alone interview every single one.

That's what recordings are for.

BooneDoggle76erz wrote:
A game is supposed to be played again; if someone says "no", what valuable information are you gleaning from them?

That's why open-ended questions are better than yes/no or multiple choice. What useful information would you get? You'd know to read the rest of that play-tester's survey from the viewpoint of someone who did not enjoy the game.

BooneDoggle76erz wrote:
The ability to perform analysis, compile large amounts of data, and then graph the data in Excel for later comparison and quick reference is priceless.

It's also worthless if you end up with a data set that's invalidated by obfuscated evaluation criteria, or one so generalized as to be meaningless.
 
David Goh
Singapore
What a detailed checklist!

In this case, I agree with Fire_Forever's answer. As a playtester myself, I generally prefer responding to open-ended questions. That's because I tend to work off my first impressions and then slowly become more in-depth and critical, either through further discussion with the designer or through introspection.

Because of this, I find checklists quite tiring and limiting to work through (especially long ones like this). Granted, I think a checklist like this is good to have if a large number of playtesters are trying the game remotely and you just want raw data... but otherwise, I'd prefer a more balanced mix of multiple-choice and open-ended questions.
 
"What do you mean, I can't pay in Meeples?"
Canada
What exactly is the end-goal? What problem are you trying to solve?
 
David Goh
Singapore
BooneDoggle76erz wrote:
In other words, what method is better than this, if this is not valid?



I wouldn't say that this method isn't valid... I think it all comes down to qualitative vs. quantitative analysis. There's value in both, and which a designer finds more useful depends on what their process is like. The most balanced approach would be to pursue both equally, but I find that indie designers tend to go one way or the other.

As a playtester, my preference would be something closer to an open-ended discussion (qualitative) rather than filling out a lot of multiple-choice questions (quantitative). In the case of your checklist, I feel some questions are better presented as open-ended because of how ambiguous they seem, such as "Mechanics", "Implication" and "Overall reaction". As for categorizing the data, I feel that's up to the game designer to provide direction on, because everyone has different metrics and research goals.

As a designer myself, wherever possible I try to have an open-ended discussion with my testers. I also adjust the direction of what's discussed based on factors such as:

- The number of games played (for example, I'd be less concerned about discussing balance if only a few games have been played, and more concerned with fun and replayability)
- The tester's profession or field of expertise (if I'm testing with writers, I'll focus on unearthing copywriting issues; if I'm testing with graphic designers, I'll focus on getting feedback on the visuals, if the game has been themed)
- The tester's board game preferences (whether they're casual or hardcore players, how they feel about conflict or randomness in games, etc.)
- My own goals for the specific playtest (for example, this session I introduced a new set of cards, so I particularly want to know how they're received)

I also engage in quantitative data collection, but that's usually for balance purposes. For Endogenesis (a game I'm working on), I take note of win rates broken down by factors such as turn order (useful for seeing whether there's a first-turn advantage, for example), the cards used, and the end state of the winning player.
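As a minimal sketch of that kind of tally (the session records below are invented, not actual Endogenesis data), one could log which seat won each game and compare win rates per seat:

from collections import Counter

# Hypothetical session log: (winning seat, number of players that game).
sessions = [(1, 4), (1, 4), (3, 4), (2, 3), (1, 3), (4, 4), (2, 4), (1, 4)]

wins = Counter(seat for seat, _ in sessions)
appearances = Counter()  # games in which each seat was occupied
for _, players in sessions:
    for seat in range(1, players + 1):
        appearances[seat] += 1

for seat in sorted(appearances):
    rate = wins[seat] / appearances[seat]
    print(f"seat {seat}: {wins[seat]}/{appearances[seat]} wins ({rate:.0%})")

If seat 1 wins far more than its share, that's evidence of a first-turn advantage worth balancing.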
Ben Bateson
Ross-on-Wye, United Kingdom
90% of this list is irrelevant to game design. You can re-skin, re-theme or change the general appearance of a game in a second, and publishers often do exactly that as a matter of course. If a game was delivered to a publisher with terrible mechanics and no theme, but a Star Wars franchise approval taped to the box lid, they'd publish it within the month.

You can't sum up the experience and mechanics of a game in a few pithy catchphrases. And even if a playtester thought they could, I've no idea what use their feedback would be.

Ben Bateson
Ross-on-Wye, United Kingdom
BooneDoggle76erz wrote:

Ben, what process do you use to analyse the games you're having playtested?


We play the games and tell the designer what we think. We make suggestions and discuss how feasible they are. That's all. If the designer gave us a twelve-page questionnaire to fill in with every playtest, he'd never get any playtesting done.
Ben Bateson
Ross-on-Wye, United Kingdom
BooneDoggle76erz wrote:
how do you quantify 1300 playtests?


You say 'this game has been playtested 1300 times'. Lots of playtesting does not automatically mean a better game. It just means the designer has access to more playtesters.

Speaking as a quality professional, I see this as somewhat analogous to the debate over QC vs. QA. Quality Control is traditional inspection of the product: fault-finding, measurement and so on. That is pretty much what playtesting is; in my experience, it's what modern designers do to make sure their games aren't buggy or breakable.

However, if you want to make a quality product, you practise Quality Assurance: building the necessary controls and practices in from the very start, so that a good game never needs to rely on Quality Control in the first place. There are plenty of good examples on the market now. Spot It, for example, is based on such an ingenious bit of maths that it almost certainly needed minimal playtesting. Similar things probably apply to a lot of abstracts and quite a few Knizia games.
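For the curious, the maths behind Spot It is a finite projective plane: for a prime n you can build n² + n + 1 cards with n + 1 symbols each, such that any two cards share exactly one symbol (the retail deck uses 55 of the 57 cards that n = 7 yields). A minimal sketch of the construction:

def spot_it_deck(n):
    # Cards are the lines of a projective plane of order n (n prime);
    # symbols are its points, so any two cards share exactly one symbol.
    def point(x, y):          # affine points, numbered 0 .. n*n - 1
        return x * n + y
    def inf(m):               # points at infinity, numbered n*n .. n*n + n
        return n * n + m
    deck = []
    for m in range(n):        # lines y = m*x + b, plus their slope's point
        for b in range(n):
            deck.append([point(x, (m * x + b) % n) for x in range(n)] + [inf(m)])
    for c in range(n):        # vertical lines x = c
        deck.append([point(c, y) for y in range(n)] + [inf(n)])
    deck.append([inf(m) for m in range(n + 1)])  # the line at infinity
    return deck

deck = spot_it_deck(7)        # 57 cards, 8 symbols per card
assert all(len(set(a) & set(b)) == 1
           for i, a in enumerate(deck) for b in deck[i + 1:])

Because the matching property is guaranteed by the construction, "playtesting" the deck reduces to the single assertion above.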
 
Ben Bateson
Ross-on-Wye, United Kingdom
BooneDoggle76erz wrote:
Should we then reason that so many bad games exist because of an abundance of bad playtesters?


That's akin to saying that your house fell down because your insurance company were incompetent.
 
Carel Teijgeler
Vlaardingen, Netherlands
Do you really have to post multiple replies in a sequence when they could all have been combined into one?

There is an Edit button at your disposal.

 
Ben Bateson
Ross-on-Wye, United Kingdom
BooneDoggle76erz wrote:

A playtester's feedback is nothing more than that of a customer who leaves with a complaint, or provides a suggestion and puts it in the little box, or asks the business to provide a particular service. A playtester isn't going to sit in on the round-table discussion about game design decisions, nor should he.


This is 100% not my experience of playtesting.

Quote:
...and you know better than anyone that when you're going over quality control, you have a checklist, a set of guidelines to abide by


Yes, but those specifications are provided by the people who designed the product in the first place. The analogy you're looking for is the game rules.

Quote:
by saying that 1300 playtests don't make a better game, don't you negate the purpose of the tests in the first place?


Yes. That was exactly my point. Some games don't need 1300 playtests. The fact that Scythe did doesn't make it any more impressive a game, in my view.

Quote:
IF a person wants to print 500 copies of a game and then be done with it, then you don't need playtesters and data. But if you want to have a game in the BGG top 10, you're going to have to step up your game.


You're making the faulty assumptions that a) the BGG top 10 is a significant goal for designers, and b) all games in the BGG top 10 have been playtested to your (imagined) standards.

Quote:
I playtested a game last year through three versions. I don't believe the game ever found a publisher, nor will it, despite everyone at the table saying they would replay it and having mostly good things to say about it.


I interpret this as "publishers have much more sway over a game's marketability than playtesters do". You are completely correct. Improving playtesting documentation won't change this.

Quote:
The designer just hit a roadblock and couldn't get the playtime reduced. That feedback wasn't given to him by his testers; it was fundamentally a bad design, and the data showed it.


So, you're saying that the playtesters were no good? And you were one of the playtesters? And you want to tell other playtesters how to do it??????
