Several years ago I was reading a column in PC Gamer written by Desslock, who has been writing about computer RPGs for many years. He likes skill-based development systems because players improve in the capabilities they actually use, rather than allocating experience points to whatever improvements they choose, perhaps being required to "train" in those new abilities. To him it makes much more sense that you improve at the things you actually do than at the things you train for.
Bear with me a while here as I veer into teaching and then back to RPGs. I agree, though as a teacher I recognize that a good teacher can convey their experience to help someone avoid the lessons of the "school of hard knocks". I also recognize that it's possible for someone to do something over and over, but so poorly that repetition never leads to improvement. But as I read, I realized that, in the United States at least, a great many people believe that training is the best way, or the only way, to learn how to do something. I remember one 18-year-old student telling me a few years ago that he and his classmates had been taught in high school that the only way to learn how to do something is to take a class for it! This was a student in a game design class; OTOH, I certainly never had the opportunity to take any game design classes, but I do pretty well at it and know quite a bit about it.
Yet I see this attitude, that classes are the only way to learn, institutionalized in our schools and colleges. The accreditation agencies that accredit typical public and private colleges and universities in this country emphasize degrees as the major criterion of qualification for teachers. It does not matter if you have been teaching the subject for 30 years: that experience is explicitly disregarded. I was told about someone who had taught a subject for 32 years in a local high school and received a letter from the state telling him he was not qualified to teach it because he did not have a degree in that area. (Yet at the same time, in the same state, a large proportion of K-12 teachers have no formal qualifications, not even a teaching certificate. These are lateral-entry people who are allowed to teach for up to three years before they must earn the certificate.)
It does not matter, unless the school is willing to go through a lengthy portfolio process, that you (for example) worked in networking at a major medical center for more than nine years before teaching networking classes. If you don't have a networking degree, you are not qualified to teach networking, even though networking degrees did not exist until about 15 years ago, so anybody who went to school before then could not possibly have one. (These are actual experiences, not hypotheticals.) One college president told me that a person with a PhD in zoology was deemed by the accreditation people not qualified to teach freshman biology - zoology and botany being the two major divisions of biology - and as a result the school terminated the teacher! If this had been anticipated, or if the school had been willing to push back and assemble a portfolio for the instructor, he almost certainly would have been deemed qualified. But schools are very rarely willing to go to this trouble.
So we get a situation where, for example, the founder of creative writing as a university curriculum later said it should be done away with. The major reason is that the people who have actually published novels and other kinds of creative writing that people pay money for usually do not have Master's or PhD degrees in creative writing, and so are "not qualified" to teach it. The people who are officially qualified to teach creative writing have gone through creative writing programs but may never have had anything published commercially.
What we tend to get as teachers in colleges and universities is people who have gone through undergraduate and then graduate school and have a Master's or PhD in their subject, but have never actually practiced it in the real world. For some subjects there is no way to practice in the real world, but others are very much practice-based.
Given how this point of view has permeated schools, colleges, and universities, should we be surprised if role-playing games take the same sort of path? I always thought one of the dumbest rules in early versions of D&D was the requirement that when you accumulated enough experience points to rise a level, you had to pay somebody an exorbitant sum to "train" you to be able to act at the new level. It was dumb from a gameplay point of view, because applied as written it turned adventurers into money-grubbers scrambling to afford training. It was also dumb because, if you have already done the things that enabled you to survive and prosper, why would you need somebody to train you? (And we can ask the chicken-and-egg question: where did the original trainer come from? There must be a way to learn these things successfully without being trained.)
Computers are ideal for skill-based development because the computer can keep track of what you did and raise your capability as you go along. This is much more difficult to track in tabletop RPGs.
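The bookkeeping a computer does here can be sketched in a few lines. This is my own illustration, with an invented class name and an arbitrary threshold of ten uses per level, not the system of any particular game:

```python
# A minimal sketch of use-based skill progression, the kind of tracking a
# computer RPG can do automatically. Names and numbers are illustrative
# assumptions, not taken from any particular game.

class Skill:
    def __init__(self, name, level=1):
        self.name = name
        self.level = level
        self.uses = 0

    def use(self):
        """Record one use; every 10 uses (an arbitrary threshold for this
        sketch) the skill improves by one level."""
        self.uses += 1
        if self.uses % 10 == 0:
            self.level += 1

# The character improves only in what they actually do:
swords = Skill("swords")
for _ in range(25):
    swords.use()
print(swords.level)  # 25 uses crosses the threshold twice: level 3
```

A real implementation would add diminishing returns at high levels, but the core point stands: the program records what the player does, so no "training" step is needed.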
(This was originally a response to a question on Quora.)
Because there are so many kinds of game designers, the answer to the question is the same as the answer to many questions about game design: it depends.
The difference in experience between being a game designer who is working full time for a video game studio, someone who is an indie video game designer, and someone who is a freelance tabletop game designer such as myself, is immense. (There are a few full-time tabletop designers working for publishers, as well.)
For example, almost all of the time I can design whatever kind of game I want to design, and either I find someone to publish it, or I self-publish it (which I personally do not do, but most tabletop designers do these days), or it doesn't get published. Someone working as a game designer full-time may be lucky enough to work on a game they want to do, but much more likely will be working on a game that someone else decided is the one the studio needs to make. Indie video game designers tend to fall more into the freelance category in this respect; they're on their own.
Video game designers tend to work on one game at a time, the one they're trying to prepare for publication, while experienced tabletop designers tend to work on many games in a given stretch of time. The difference comes from how long it takes to get a game to a decent prototype. There is no programming, art, or sound required for a tabletop game, so you can reach a good prototype relatively quickly compared with a video game. And going from the good prototype to the finished game takes far longer for a video game than for a tabletop game - the publisher takes care of production for the tabletop, that is, if the designer has licensed it to a publisher rather than self-publishing.
Tabletop designers often spend a great deal more time involved in playtesting than video game designers do. Much of that is because video games are designed to be played right out of the box, whereas someone has to read the rules of a tabletop game. And of course you can make as many copies of a digital game as you want at no cost to send to playtesters, so it's relatively easy for a video game studio to send their game out for "blind" playtesting (testing where the players have no knowledge of the development of the game). Tabletop designers spend much more time overseeing face-to-face playtesting of their games than they do actually designing them.
Video games can also go into “Early Access” or some other kind of pre-release and even post-release testing that is not possible for tabletop games.
Employment conditions in video game studios vary immensely. What Chris Crawford said 15 years ago is still true today: there are so many people who want jobs in the video game industry that the employers have supply and demand on their side; in that situation, employees are often treated poorly.
My game Doomstar, in video form, is now listed on Steam and will be available in mid-September. http://store.steampowered.com/app/504750/ https://largevisiblemachine.itch.io/doomstar
The Beta is available in some inexpensive bundles (which I thought were piracy, but are not!).
Following is the text of the slides.
Game Design: “Don’t Make Me Think”?
Dr. Lewis Pulsipher
“Game Design” channel on YouTube
Thanks to Steve Krug
There's a well-known book about website usability by Steve Krug titled "Don't Make Me Think"
He means, don’t make people have to think to find things on websites
Most games involve some thinking, but we can adopt this motto to mean, don't make players think about anything other than the gameplay of the game
Unfortunately, it can also mean, not thinking at all, and that attitude is becoming more common in hobby gaming
It’s always been the norm in party and family games
So we have three meanings
First is wholly desirable: don’t make people think about the interface and how they manipulate the game
Manipulating the game should be second nature
Second is “the way it is”, though I don’t like it: don’t make people have to think when they play a game
This is already rampant in video games, with many being “athleticware” rather than “brainware”
Athleticism, physical skills, dominate in athleticware
Many F2P games have become reward-based rather than consequence-based
Third is what I first referred to, make people think only about gameplay
1) Don’t make me think about the Interface
The interface is how the player tells the game what he/she does, and the game tells the player what happens
Manipulating the game should be second nature from very early on, players should not need to struggle with the interface
Players of tabletop games shouldn't have to remember "every third turn" (and the computer should take care of that for video games)
Players shouldn’t have to do arithmetic unless it's necessary to good gameplay. Players shouldn't have to remember odd aspects of victory conditions unless it's necessary to the game
This is why combat lookup tables are frowned upon nowadays: players have to think about something not usually necessary to gameplay
Innovation is often praised in games, but innovation in the interface is dangerous
Because familiarity, not newness, makes interfaces easier
“Intuitive,” when used in conjunction with UI, usually means “familiar”
2) Don’t make people think when they play a game
Many people want to be entertained when they play a game, they don’t want to put in an effort
They are passive rather than active
It’s like watching a tentpole action movie such as “Avengers”
I really like Avengers, but games are different from movies, to me
But to many people, these days, they are not much different
They want to be given things, rewards, not to earn anything
This attitude used to be confined to party and family games, but is now common in hobby games
There are lots of ways to do this:
Reduce the number of plausible choices for each decision
Reduce the number of decisions
Provide catch-up mechanisms such that people can mostly not pay attention for much of the game, and still have a chance to win
Make it obvious (“transparent”) how you need to play to win
Provide well-signposted “paths to victory”
Dexterity games – combining athleticware with brainware
You don’t have to think (or, not nearly as much)
Old codgers and naturally slow folk like me aren’t fans
I used to play Total Annihilation on dead slow in order to enjoy it
That turned it into a thinking game, not a reaction speed game
For example the British card game Snap (standard deck)
The failed collectible game Clout Fantasy involved dexterity
Pitch Car (racing), caroms, lots of other dexterity games
And of course, a great many video games
3) Don’t make me think about anything other than gameplay
Another way to put it: don’t put anything into the game that isn’t important to the strategies of play
A worthy goal, in my estimation
This is important to people who want to earn what they get from a game, rather than be rewarded for participation
This comes back to my motto, "A designer knows he has achieved perfection not when there is nothing left to add, but when there is nothing left to take away." (Antoine de Saint-Exupery)
The other side: “I feel stupid”
The other side of this is whether the game makes the player feel stupid (Jeffro)
That’s OK for “old-time” gamers, for chess players and the like
If they make a mistake, they recognize it and try to do better next time
It’s not OK for people who are waiting to be entertained. They don’t want to feel uncomfortable
It’s the Age of Comfort, after all
People are brought up to avoid any kind of pain or discomfort – to their detriment
Of course, there are lots of gamers somewhere between these extremes
In general, the broader the appeal of your game, the less you can make people think.
This is a three-year-old screencast from my course "Learning Game Design, Part 1"
[I've since addressed this again in "The Futility of Striving for a Great Innovation" https://www.youtube.com/watch?v=902jtgyYtwI
Sooner or later I'll finish one about Surprise in Games, because it's really surprise that players want, not innovation.]
Here is the text of the slides, though there is more in the presentation, of course:
How often is “innovation” fun?
The “cult of the new” is very strong in this century
But how does innovation contribute to enjoyment in a game? Mainly by “surprise”
Yet something that’s not innovative to an “expert” may still be innovative to a novice
Most people play games to enjoy them, and innovation isn’t important to that
Think of all the video game sequels that sell so very well
Recent check of “most anticipated” game list in PC Gamer showed 12 out of 13 were sequels
One man’s innovation is another’s ho hum/old hat
Tim Sweeney (Epic Games founder) in Gamasutra Interview 2009:
“That's kind of a common pattern in everything I do. One minute I'm completely on my own and I think, "Wow, I'm a genius, I can't believe this idea nobody else had!" And then you look at the references on it, and it turns out that a hundred other people have done the same things in the 1980s. And then you look, and you get your additional ideas from those. Between invention and stealing, you come up with a really good combination of ideas.”
Combinations and Models
Good combinations may not be purely innovative, but are often brand new even though each element is not
Further, many games are models of some reality. Then a good model is what makes a game good, not innovative mechanics or other elements
Make good combinations, make good models
Learning Game Design, Part I: https://www.udemy.com/draft/786564/?couponCode=1LGD39
Three separate topics in one post
Game Design: Not much to Show about the Process
As you may know I make hundreds of screencasts about game design, many of them in courses at https://www.udemy.com/user/drlewispulsipher/ (discounts at pulsiphergames.com). And I've written a book about game design: http://www.amazon.com/Game-Design-Create-Tabletop-Finish/dp/...;qid=1459717843&sr=8-1&keywords=lewis+pulsipher
But game design goes on in the mind. There is little to show. I can show someone making a map for a game, I can show people talking about game design, I can talk about game design, I can show you a game being playtested: but I cannot show you game design, because it's internal, not external.
I think many people don't quite understand that.
Yet recently I’ve been made aware of the gamedev channels on Twitch TV, the live-streaming site. Most streaming there shows someone playing video games, often watched by thousands. Gamedev streams show game developers crunching their code while occasionally answering questions from viewers. Still not much to show, but it works for several dozen viewers! Yet it’s about programming much more than about game design.
Chinese history in a (cyclical) nutshell:
1. Anarchy reigns, famine widespread, population plummets
2. Fairly stable "nations" (often called dynasties) established
3. Sooner or later, someone unites the land and becomes emperor (by Mandate of Heaven, of course); the land prospers
4. Population becomes too high for current agricultural technology, banditry erupts, anarchy reigns, famine widespread, emperor/dynasty overthrown, population plummets - that is, back to 1.
Though it must be said, sometimes external invaders come into it, though usually the invaders succeed in bad times and fail in good times. Sometimes one dynasty was immediately succeeded by another; sometimes a period of warring states intervened.
Hiring an F2P Game Designer
How to hire [F2P] Game Designers [for a small studio], by Ilya Eremeyev. http://gamasutra.com/blogs/IlyaEremeyev/20150720/248965/How_...
Detailed, some interesting points of view both in the kinds of designers, and in the crap he has to wade through.
I don’t know his company or even country, though the name is Slavic and English is not his first language.
Game designers are basically divided into 2 types: Game designers - storytellers and game designers-mathematics.
The first ones see their role in developing a “feeling”, writing a plot, quests, items descriptions and game universe backstory.
Second ones are all about balance design, economics, gameplay formulas and calculations.
In all conscience, most designers unite those skills but usually they focus on the one side more than on the other.
Usually, I hunt people with math\programming background and surely add this condition to our vacancy, which helps immediately cut off a half of unsuitable candidates and save some time.
After receiving “CVs”:
"First of all, I isolate infants, crazy dreamers, too juniors without any experience and strange guys who send me messages like ”Yo, wanna work in your company, I have a lot of ideas, but won’t share them with you, mail me dude” and instead of useful skills they point their love of anime and coffee."
[The noobs who haven’t figured out that game design does not equal ideas.]
[He gives them a seven-part test (questions but not answers included in the blog post) involving probability and knowledge of F2P games.]
"It is very important to catch a free-to-play haters, ideological pirates and peoples who stuck in the past. To understand how to make free-to-play games it is necessary to play and pay by yourself."
[As a college teacher, my colleagues and I were told by administration that we should not discuss a student (current or former) with any prospective employer unless we had only good things to say. Colleges nowadays are afraid of being sued. This fear-of-litigation tends to reduce the value of references and recommendations. Here's the author's take:]
"Recommendations are a very useful tool in hiring, which unfortunately [are] often ignored. It is great if a candidate can provide a few contacts of his previous employers but if not I do not hesitate to contact them by myself and get a feedback about candidate’s performance."
Tue Jun 28, 2016 12:32 am
I had an unusual experience for a tabletop game designer at the UK Games Expo. I watched an unpublished prototype that I had designed being played long after I had "finished" it. (And about 35 years after I started it.) The viewing in itself is not unusual, because I sometimes leave a prototype for years and then come back to it. But in this case I didn't watch just one play, I watched more than half a dozen. The game is a tactical space wargame that I call Doomstar, which is being converted to video. The idea is that the video version may help convince someone to publish the physical version. (Making money with a small video game is very much a matter of chance.)
(I'm happy to say that I couldn't find any fault with the game, either, and that's unusual for me. I usually have questions/doubts.)
The game is vaguely reminiscent of L'Attaque/Stratego, but immensely more fluid and less hierarchical, quite a different (and I think, better and much shorter) experience than Stratego.
If you want to sign up for the Doomstar PC beta, go to www.doomstargame.com/beta
While I haven’t consciously adopted a "philosophy" of design, these are my observations of what happens.
I design two kinds of games. One kind is strategic, I hope with some gameplay depth, and highly interactive (like Britannia or Dragon Rage). The other kind is "screwage" games (where you "mess with" the other players) that are not very strategic, but provide a fair bit of interaction within the game (though this varies) (like Sea Kings). I am no longer a big fan of two-player games, and even when I design something that is two-player, I try to provide for partnership play.
The strategic games tend to be 2-3 hours long, the screwage games under an hour. Some of those games can be played quite quickly, but I don't design games intended to be less than half an hour, because such are bagatelles ("a thing regarded as too unimportant or easy to be worth much consideration").
I prefer fairly simple-to-play games over fiddly, rules-complex, or many-pieces games. I avoid any deliberately-added complexity. My motto is "A designer knows he has achieved perfection not when there is nothing left to add, but when there is nothing left to take away." - Antoine de Saint-Exupery. Another form, about Japanese gardening actually, is "Your garden is not complete until there is nothing else that you can remove."
I dislike puzzles, things that have always-correct solutions - like most Eurogames (not all) and most single-player video games (not all).
I try to keep the number of pieces, cards, and other Stuff a player controls to under two dozen in multi-player (more than two players) games, and under three dozen in two-player games. Strategic and tactical complexity can be achieved without large numbers of elements to keep track of.
On the other hand, I am not of the "I only want a few alternative choices" school. In other words, a chess-like element is often in my games (lots of different possibilities, though few pieces). I have simplified some of my games (generally the screwage card games) to the point that only a few practical moves need to be considered each turn.
Part of the "chess-like" method is that you must watch every move your opponent(s) make, and react to it. (Though the reaction can be to ignore it, you have to figure that out, not ignore it to begin with.) If you don't "watch like a hawk", you are very likely to lose.
I try to reduce the effects of chance in strategic games, either with lots of die rolls that will tend to "even out" (Britannia) or by using elements such as cards that players have some control over. The more cards are used in a board game, the smaller a part dice should play.
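The "evening out" is just the law of large numbers at work. A quick sketch (my own illustration, not from the text) shows how the swing of the average roll shrinks as more dice are involved:

```python
# Why "lots of die rolls tend to even out": the standard deviation of the
# AVERAGE roll shrinks as 1/sqrt(n), so a game with many rolls is less
# swingy per roll than a game decided by one or two.
import math

faces = range(1, 7)                           # a standard d6
mean = sum(faces) / 6                         # 3.5
var = sum((f - mean) ** 2 for f in faces) / 6 # 35/12, about 2.917
std_one = math.sqrt(var)                      # about 1.708 for one die

for n in (1, 10, 100):
    std_avg = std_one / math.sqrt(n)          # std of the average of n rolls
    print(n, round(std_avg, 3))
```

With 100 rolls the per-roll swing is a tenth of a single roll's, which is why a long game like Britannia feels far less luck-driven than its dice might suggest.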
I design games, not simulations. I prefer players to have control of their pieces; perhaps they can’t do everything that they want to (that’s good, it makes them make choices), but there is no random element that prevents them from doing something with their pieces, however realistic that may be. In the end, it’s a game. Hence, I have little interest in many of the more recent additions to wargames that model the huge uncertainty of warfare such as "chit pulls", activations, and "card driven".
I like the game to represent something, to be a model. In many Euro-style games, the atmosphere (often wrongly called theme in this case) is "tacked on" (and could be changed considerably), and the players are entirely concerned with pure mechanics and with the other players. I like to be able to understand that when I move something in the game, or do something in the game, it’s something like an event that could happen in reality. And when something happens in a game, it could have happened in reality. Having been educated in history, I am far more skeptical than most about relating real-world events to game events.
An historical game can teach the players something about history. I am not, however, of the "what if" school of varying one factor or one decision to "see what would have happened". My games tend to be at a high (strategic) level where it is practically impossible to "model" the factors that produced history, so it is rarely practical to use the "what if" query.
So I model effects, not causes.
Despite all that, I do design the occasional fairly abstract game, because abstract games are a form of "pure" game. What I rarely do is design an abstract game but pretend it's something else.
People play games for many reasons. I play either (in cooperative games such as D&D) to "succeed in the mission" and keep everyone on my side "whole", or (in competitive games) to win the game. I like to know the rules of a game thoroughly; I much prefer to read a set of rules rather than have someone teach me, probably because I want to thoroughly know what’s going on. I recognize that the rules-reading preference, in particular, is a minority view! Nonetheless, I tend to design games that I like to play, though I design games for other people, not for myself.
(Originally appeared in my "expert Blog" on Gamasutra.)
You may have heard me in the past talk about the widespread displacement of consequence-based gaming by reward-based gaming. Party games, and to a lesser extent family games, have always been reward-based (you're rewarded for participation) rather than consequence-based (winning and losing matter, among other consequences), but hobby games were usually the latter. The change in hobby games started in the video game world, where most single-player games are puzzles rather than opposed games, and so as long as you are persistent - especially when you can use saved games to try different things - sooner or later you'll solve the puzzle.
Puzzles have always been with us, and truth to tell, puzzles are more popular than games with the population as a whole.
But the move to reward-basis is far stronger now. Subscription games (MMOs) and now Free to Play games have been the real turning points, because the player must constantly be enticed to stay in the game long enough to begin spending money in the various ways that games extract/entice money from players, other than purchasing the game. So players are constantly rewarded, and practically all the consequences of their actions are good for them. Some players go so far as to blame the game if the player does not succeed.
I have maintained that if there are no consequences to your actions, you don't have a game, you have a playground, a toy. And in a typical video game with its save game capability, how can there ever be any consequences to your actions, because you can always go back to your save game and try again?
Tabletop games have always had consequences when you were playing with other people, because you can't go back and try again, you have to accept what happens, and that often involves losing the game. I think we're starting to get away from that now in some tabletop games, which are more reward-based than consequence-based.
I was recently at the East Coast Game Conference in Raleigh, North Carolina, where the keynote speaker was Warren Spector, designer of Deus Ex, Epic Mickey, and other games.
Most video games have right and wrong choices, with the right one(s) leading to the planned ending (or several endings). As Spector pointed out, they tend to be black and white, right and wrong. Warren Spector wants player choices in (video) games to have consequences, but does not want the choices to be right or wrong, black or white. That's the difference between what he does, and a puzzle, where the right choice leads toward the always-correct solution. He wants to ask questions of the players and have the players grapple with possible answers, but he definitely doesn't want to answer those questions for the players. These questions are sometimes profound, as in what does it mean to be human (as opposed to a cyborg, robot, or alien).
Moreover, Spector wants the choices players select to make a difference in the outcomes of the game. There are a great many video games where you can make different choices but in the end the consequences are the same, including many branching games, because the branches ultimately converge back to a single node regardless of which choice you made.
Of course, *good* tabletop games always have consequences to the player choices. It's built into the form with human opposition. These are consequences not only in success and failure, but in the outcomes of the game. For example, even though some people believe that my historical game Britannia is a heavily scripted game, you don't see two games go exactly the same way. Each player choice makes a difference in the outcome, and there are millions of possible outcomes.
At one point Spector asked the audience if any of them had noticed that the big splash screen at the end of one of the Mickey games was created based on all the decisions the player had made throughout the game, so that there were thousands of different possibilities. Then he wistfully answered his own question by saying probably no one had noticed.
Someone beat me to it and asked how a game can have consequences when it has save-games. Spector said he has no answer (though he had obviously been asked many times before), and that it's most unlikely that many games will be sold without saves (other than Rogue-likes). He did say that with one of his games (I think Deus Ex but it could've been Epic Mickey) he expected players to take one or another of the choices presented to them and run with it. Instead players would try each possibility and save the result, and when they had tried everything then they took the result they liked the most and went on from there. This is the epitome of lack of consequence. Yet, he said the player has paid their money and they can do what they want with the game. (In free to play games, then, how do you address this form of activity?)
Spector mentioned that in another of his games he allowed players to switch at will from one line of choices to another (I cannot recall whether it was character class or something else). And this had ruined the game, because it removed the consequences of so many choices.
In effect Spector was talking about an idealized form of a video game, rather than the form that's actually played by most game players these days, which is the save-and-try-again-until-you-like-it method. By and large I prefer the practical to the ideal in game design; fortunately, you can design a video game with Spector-style consequences, and that will work both for those who do the save-game tactics, and those who don't.
Consequences are a form of constraint, and contemporary players do not like constraints. They want to do whatever they want to do, as though they were on a playground or playing with toys. We've seen this occasionally for many decades; it showed up early in Dungeons & Dragons. For example, character alignment was a form of constraint, and a great many players railed against alignment because it prevented them from doing whatever they wanted to do, from being what I call Chaotic Neutral Thugs, from behaving as if they were in their own private playground. But now the attitude is much stronger, and there are many video games that pander to it in the name of retention (so that the player will spend money).
Games are inherently a bundle of constraints. But we can design on a spectrum from strong constraints (where there are consequences to player actions) to ones with weak constraints (players rewarded for participation).
Tabletop games used to have a tradition of open games, where you could play in whatever playstyle you wanted. That's been undermined by puzzles, where you have to conform to the always-correct solution. I call these puzzle-games, epitomized by very many Euro games and most single-player video games, "closed games". Spector is recommending that developers make open games, not closed ones.
As do I. Unfortunately, closed games seem to be what the large majority of players want. And closed games are easier to design.
Wed May 11, 2016 10:25 pm
There are two fundamental ways to approach design of games (and of RPG adventures).
One approach is to set up an interesting situation and let the players cope with it as best they can, "write their own story", and in this case each group of players is likely to write quite a different story from the same situation.
The other approach is to establish a linear course, a story, for the players to follow. The designer writes the story, not the players. Each play of the game/adventure follows roughly the same course.
And there's everything in between those extremes. But it's a spectrum from one extreme to the other. Most designers are some of one and some of the other.
Whether you call this "emergent vs progressive rules", or "open vs closed" or "sandbox vs linear" or "ludology vs narratology", or something else, it amounts, in every case, to "how much does the creator want/try to control what the players do?"
I suspect many who haven't actually created games and adventures don't quite see how clear the choice is, in the end.
I am very much of the "let them write their own story" side. For me, a designer gives players the tools to enjoy themselves and doesn't impose upon them. ("Are you a Game Designer or a Fiction Writer?" http://youtu.be/Gl9EMszhYNo ) But many take the other extreme, or something close to it.
Mon Apr 25, 2016 11:28 pm
Why aren't computer RPGs (especially MMOs)
as much FUN to play as old-time D&D?
Lewis Pulsipher (Originally written Oct 2009)
[This was originally completed in October 2009, but for various reasons has not seen the light of publication. Generally it still applies, but occasionally I’ll interject some comments in brackets from the perspective of 2016.]
Oh, but they ARE as much fun, you say? Yet I don't see much evidence of that. For so many people it seems like a lot of work, especially in MMOs - "the grind" - aimed at rising in level. People don't enjoy the journey, they only enjoy the destination ("I'm 80th level!"). That's why there's a big market for sale of items and gold and even entire accounts for such games, the market addressed by "pharming". (More details later.)
How did this happen? We can observe that, in hard core video games in general, this "ennui" seems to be a problem (ennui: "a feeling of utter weariness and discontent resulting from satiety or lack of interest; boredom"). The journey isn't much fun. People brag that "I beat the game," often throwing in an impressively-short duration of play, or that "I made maximum level", but they don't appear to have enjoyed it. Few of the hard core ask "did you enjoy playing?"; instead they ask "how long did it take you to beat the game?" They want the result, not the experience. It's as though a ten year old who wants to be wealthy when he's 60 would be happy to jump from 10 to wealthy 60 without experiencing the years in between.
Focus on “Leveling up” and lack of Group Play
Where games involve character levels, there are two possible reasons why this has happened. I played First Edition AD&D for 30 years starting in 1975; my highest level character made 14th, but the last two levels were from magic items and he never actually played higher than 12th, which is just as well because the game doesn’t handle 14th level at all well. Most of my many characters didn't make double figures of levels. It took a LONG time, many long adventures involving several people, to "level up". I recall one character that took ten adventures to reach second level. So of course, I played the game not to level up, but to enjoy the adventure - as we all did. (I can even remember discovering that a character had risen a level, but I hadn’t noticed because I’d not tallied the experience points from the past several adventures. “Leveling up” was not the objective.)
I knew a former WoW pharmer who said he could reliably go from 1st to 30th level in 16 hours. Nowadays in video games, it's quite easy to rise in level, and not surprisingly the objective of many players becomes rising in level rather than enjoying adventures. Few players say "I really enjoyed that game"; instead they say "I made 80th level".
Perhaps much of the reason for this change in objective, and consequent change in enjoyment, is the solitary nature of MMOs and computer RPGs (something that has ended for folks who join guilds and participate in big raids). Face-to-face D&D is a social game, one that you enjoy with friends (or people who become your friends), one where much enjoyment is taken from the talk and activity between (and often during) the actual adventures, as well as from the adventures. This is only now starting to become common in MMOs and online RPGs. In times past, people playing alone didn't have other people to share their adventures with, to commiserate with, to recount old events. Lacking that, what could they do? Concentrate on "leveling up".
Too Much Like Work
But even in online games we find people doing more and more that seems like work. Nick Yee, then of Stanford University, wrote a journal article called "The Labor of Fun: How Video Games Blur the Boundaries of Work and Play" published in 2006. He used data from over 35,000 surveys completed by MMO players. From the abstract:
Video games . . . transformation into work platforms and the staggering amount of work that is being done in these games often go unnoticed. Users spend on average 20 hours a week in online games, and many of them describe their game play as obligation, tedium, and more like a second job than entertainment. Using well-known behavior conditioning principles, video games are inherently work platforms that train us to become better game workers. And the work that is being performed in video games is increasingly similar to the work performed in business corporations. (Google "Nick Yee Labor of Fun" for a PDF of the article.)
Some of this “work ethic” may be because players pay to play the game, so they feel obligated to play even if they don’t enjoy it. But that’s a minor factor, as those who really don’t enjoy it will quit.
[Far fewer games are paid for these days, rather they’re free-to-play (F2P). Though many who play long enough to reach “max level” will still be spending money.]
Even when many people participate together, the experience of actually playing the game is rarely social. Listen to accounts of the big raids in MMORPGs. Every person is assigned a task (DPS ["damage per second"], healer, etc.); must do that task with precise timing; and does nothing else. Each person's experience is uni-dimensional, a cog in a machine rather than an independent actor. If a few people mess up their timing or role, the whole raid can fail. Because of the time pressure, there's no opportunity to think, to use strategy, or to enjoy what's happening once the raid starts.
Does that sound like fun? Contrast this with old D&D played at a leisurely pace, with lots of time to think and enjoy what's happening, where every character could act independently while keeping the good of the group as a whole in mind. [I suppose the key is the difference between “brainware”, using your brain to succeed in tabletop games, and “athleticware”, using your physical prowess to succeed in video games. There’s a lot more potential stress in athleticware.]
The "play" has become work to too many people. I remember talking with someone who was a major officer in a fantasy MMO guild for many months. He finally realized that it was work, that he wasn't enjoying it, that people treated him badly if he didn't do exactly what they wanted, or if the raids weren't successful. So he quit. There are similar examples in Yee's paper.
No Fear of Death
The other reason for the change in focus involves character death. In First Edition AD&D you actually feared character death. If you died, it hurt your constitution or your experience points, or both; at worst, you were dead and gone. In an MMO or standalone RPG, character death is generally something between a minor inconvenience and no trouble at all. Think about it: if death is not to be feared, it matters much less what you do during your play, and you can pay less attention to it. The details of play tend to blur because your full attention isn't required. (Megaman 9, for example, shows how even a minor fear of death changes a game immensely. See http://www.gamasutra.com/php-bin/news_index.php?story=21324.)
The co-creator of D&D (Gary Gygax) put it this way in one of his last publications (Hall of Many Panes): "a good campaign must have an element of danger and real risk or else it is meaningless - death walks at the shoulder of all adventurers, and that is the true appeal of the game."
"Pharming" highlights both sides of this problem. If people enjoyed playing the games, would they buy characters and items from pharmers? And if the games ordinarily required more than a dreary, predictable "grind", could pharmers produce enough such items for the demand? At the very least, the scale of pharming would be much smaller.
Obviously, a good human referee can provide more interesting adventures than a computer. Moreover, in D&D the actions of a character can change the future, whereas in MMOs that’s rarely the case because they’re designed for thousands of players. Once again, if what you do makes no difference, you’re less likely to pay attention to, and care about, what you do.
Similar Trends in Tabletop D&D
In tabletop Dungeons and Dragons itself we can see an evolution toward this same fixation on "leveling". Second Edition D&D is much like First; Third Edition D&D (3.0) is a very different game, a kind of fantasy Squad Leader, with the emphasis on players finding ways to "minimax" the system via unearned advantages (such as myriad books and articles containing new feats, skills, and prestige classes). Each character can be a one-man army, very different from First Edition where "combined arms" cooperation was absolutely necessary to survival. In First Edition fighters cannot withstand the enemy without magic-users who deal massive damage to groups, and magic-users cannot survive if the enemy gets to melee range without fighters to protect them. Characters must help each other out, and each kind of character class provides an important component of "combined arms" success. (Clerics provide defensive magic and medical help, rogues provide scouting and stealth, etc.) It is rather like American football, with fighters as linemen, clerics as linebackers, rogues as wide receivers and secondary, and magic-users as quarterback and running backs. Just as a football team will fail if some of its parts fail, the First Edition adventure party will fail if some of its members fail.
In Third Edition, every character type is designed to survive pretty well on its own. Part of this evolution is attributable to the reduction in size of the typical adventuring group. One of "Lew's laws" is "the survivability of an adventuring group varies with the square of the number of characters in it". Our First Edition parties averaged seven or eight characters; Third Edition specifies four. 3.5 is essentially the same. When there are only four characters, there's rarely a practical way to prevent the enemy from getting to the magic-user(s), who must then be able to cast spells in the face of melee opposition, who must be harder to kill, and so forth. Fighters, with the proper feats, can kill several ordinary enemies in one blow. And with "buffs" from the spell-casters, a fighter can take on a ridiculous number of monsters.
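"Lew's law" above can be sketched as a tiny worked example. This is only an illustration of the stated proportionality, not a rule from any edition of D&D: the function name and the proportionality constant of 1 are my own assumptions, and the units are purely relative.

```python
def relative_survivability(party_size: int) -> int:
    """'Lew's law': an adventuring group's survivability varies with
    the square of the number of characters in it. The constant of
    proportionality is assumed to be 1, so the result is a relative
    figure, useful only for comparing party sizes."""
    return party_size ** 2

# Comparing a typical First Edition party of 7-8 characters
# with the Third Edition standard of 4:
print(relative_survivability(8))  # 64
print(relative_survivability(4))  # 16
# Halving the party size cuts relative survivability to a quarter,
# which is why a four-character party needs each member to be
# far more self-sufficient.
```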
Further, you are supposed to rise a level in about 11 encounters, and you could have several encounters in one adventure. In other words, leveling can occur so often that leveling can become the objective, rather than enjoying the adventure. When I set out to convert some First Edition characters to Third, the first thing I did was double their level to put them at a near-comparable place in progression. The game was also designed to scale up to 20th level (and later 40th), whereas First Edition starts to break down when characters get well into double-figure levels.
Fourth Edition D&D is for larger adventuring parties, and characters have many powers that only help other people in the party, not themselves. It appears to be designed to encourage groups to work together. Character "roles" have been added to emphasize cooperation and "combined arms". Individual characters are very hard to kill, but don't have a lot of offensive capability. Yet the general take on Fourth Edition is that it has been "WoW-ified", made to be more like World of Warcraft, with easy leveling and all the other things that have made WoW so widely popular. Fourth Edition may be a good game, but it's not D&D.
[Fifth edition D&D is much like First, except that it’s much harder to get killed because of easy healing and spells such as Revivify at third level cleric.]
Is this “bad”?
Is it "bad" that people play for the destination rather than the journey? In and of itself, no - every person has his own reasons for playing a game, and those reasons vary drastically. These people can enjoy the game, even if they're not having fun. Yet when the result is something that's more like work than play, you have to wonder what is wrong. Yee quotes a registered nurse who played Everquest: "We spend hours - HOURS - every SINGLE day playing this damn game. My fingers wake me, aching, in the middle of the night. I have headaches from the countless hours I spend staring at the screen. I hate this game, but I can’t stop playing. Quitting smoking was NEVER this hard." Maybe there IS something wrong here.
Further, when games are designed to emphasize leveling up, those who want to "enjoy the journey" are left behind. Is there anything game designers can do to help restore the fun? We can’t quite put the creativity of human referees into computer games. But already in some games, what a character does changes the world according to his view of it. (What the players do very much affects EVE Online.)
We're in "the age of instant gratification". Levels are easy to earn because video gamers expect to be rewarded at every turn. 30 years ago, experience points and the occasional magic item were sufficient reward; now expectations have been raised, and levels are the expected reward. If a designer takes away those easy levels, will people play any more? What a difficult situation! I've designed many commercially published or forthcoming boardgames, but I've only once tried to design a role-playing game - though it was a board game, not a typical RPG - and now I wouldn't even contemplate it because of the problems I’ve described.
Games are entertainment, not Life
Younger readers might howl that video games are NOT easy. Yet most long-time players recognize that, generally speaking, it's typically a lot easier to succeed at a video game than it was decades ago. Death has no sting, games are automatically saved for you, heck, some games even aim your gun for you! I'm not saying that easier is "bad", because it's what the market requires, so that people don't have to work for their entertainment; yet somehow, the entertainment has become too much like work for the hard core players, even when they're successful.
Fundamentally, then, it may be that these games aren't as fun as old D&D can be because they are designed to stroke the egos of pseudo-competitive people who think they've accomplished something important when they reach maximum level. Good D&D players know better. I remember a teenager who had an "18th level magic user", but had no clue how to play it well. He may have made it up (rather like buying an account, but much cheaper!), or he may have played with a "Monty Haul" referee. Your level didn't say anything about how well you played, and for that matter nobody outside your little group cared how well you played - you weren't competing with the rest of the world. We played to have fun, not to brag about our level or our loot (though we surely enjoyed such things when we attained them).
"Casual" players in general, and Nintendo among major publishers, haven't forgotten that games are entertainment. You don't prove anything about your worth by being a "bad ass gamer", you don't help your family, your friends, your country, your world. Commercial video games are not training for life, they're a pause from life if not an escape from life. It just doesn't matter whether you "beat the game", or how quickly you beat the game, any more than it matters whether you complete a crossword puzzle or Rubik's Cube. Casual players know that; some hard core players seem to have forgotten it, and those are often the people who "grind", who don't enjoy the journey, because they think "beating the game" is truly important even as the rest of us wonder where they got such an unrealistic, immature notion.