Multi-Classing

The Valley of the Blue Snails, a really interesting blog mostly about an unusual setting that Canecorpus has created, has a post about Multi-Classing in his D&D setting:

Valley of Blue Snails: Multi-Classes Revisited

I will be changing a few of the multi-class titles, though I’m a bit mixed on what direction to take it. The titles are similar to normal class titles (Veteran, Cutpurse, Wizard, etc) in that they are mostly for fluff with perhaps a minor ability to adhere the two classes better. I’m deciding on whether to make it very setting specific or use more intuitive titles.

Example, a Fighter-Cleric would be a Paladin. Pretty intuitive. Setting specific would be something like a Dwarven Fighter-Cleric would be a Whitebeard. Not so intuitive but perhaps a better choice since this sort of multi-class fluff is well outside of the realm of B/X anyhow. The main problem is the setting-specific titles would indeed be rather specific, slanting towards race with specific classes.

I did something similar for a (for now abandoned) retro game I was working on, which I might as well share in case somebody finds it interesting:

| Primary \ Secondary | Fighter | Mage | Priest | Thief | Actor | Ranger |
|---|---|---|---|---|---|---|
| Fighter | Warrior | Magic Knight | Paladin | Brigand | Swashbuckler | Barbarian |
| Mage | Wizard | Mage | Seer | Warlock | Witch | Hermit |
| Priest | Monk | Thaumaturge | Priest | Charlatan | Oracle | Druid |
| Thief | Rogue | Mountebank | Fraud | Thief | Spy | Outlaw |
| Actor | Bard | Conjurer | Idol | Jester | Actor | Minstrel |
| Ranger | Scout | Shaman | Pilgrim | Vagabond | Emissary | Ranger |

Basically, there are six primary classes (one for each of the six standard stats) and they combine into 36 different classes, with differing emphasis depending on whether a particular class is primary or secondary.  Somebody who’s primarily a Thief but uses magic to steal and con is a Mountebank, while somebody who is primarily a Mage, but uses stealth and deception to accomplish his ends and impress people with his power is a Charlatan, etc.  You mostly got the armor restrictions of your primary class, and the weapon restrictions of your secondary class, with most other abilities splitting the difference.  Spell users progressed in their primary class as if they were one level lower, and in their secondary as if they were two levels lower.  And so forth.
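
If it helps to see the moving parts, here is a minimal sketch in Python of how that lookup and those combination rules could be wired together. Only the 36 titles and the primary/secondary split come from the table above; the armor and weapon lists, the caster flags, and the function name are illustrative assumptions, not the actual rules of the abandoned homebrew.

```python
# A minimal sketch (not the actual homebrew rules) of the 6x6 title lookup and
# the combination rules described above: armor restrictions from the primary
# class, weapon restrictions from the secondary, and spell progression at
# primary level -1 / secondary level -2.  Armor/weapon entries are placeholders.

TITLES = {
    "Fighter": ["Warrior", "Magic Knight", "Paladin", "Brigand", "Swashbuckler", "Barbarian"],
    "Mage":    ["Wizard", "Mage", "Seer", "Warlock", "Witch", "Hermit"],
    "Priest":  ["Monk", "Thaumaturge", "Priest", "Charlatan", "Oracle", "Druid"],
    "Thief":   ["Rogue", "Mountebank", "Fraud", "Thief", "Spy", "Outlaw"],
    "Actor":   ["Bard", "Conjurer", "Idol", "Jester", "Actor", "Minstrel"],
    "Ranger":  ["Scout", "Shaman", "Pilgrim", "Vagabond", "Emissary", "Ranger"],
}
CLASSES = list(TITLES)  # column order matches the table: Fighter ... Ranger

# Illustrative restriction lists only; the real game's lists aren't reproduced here.
ARMOR   = {"Fighter": "any", "Mage": "none", "Priest": "chain",
           "Thief": "leather", "Actor": "leather", "Ranger": "chain"}
WEAPONS = {"Fighter": "any", "Mage": "dagger, staff", "Priest": "blunt",
           "Thief": "light", "Actor": "light", "Ranger": "bow, blade"}
CASTERS = {"Mage", "Priest"}  # classes with a spell progression


def build_character(primary, secondary, level):
    """Combine two of the six base classes per the rules sketched above."""
    character = {
        "title":   TITLES[primary][CLASSES.index(secondary)],
        "armor":   ARMOR[primary],      # armor restrictions of the primary class
        "weapons": WEAPONS[secondary],  # weapon restrictions of the secondary class
    }
    if primary in CASTERS:
        character["casts_as_primary_level"] = max(level - 1, 0)
    if secondary in CASTERS:
        character["casts_as_secondary_level"] = max(level - 2, 0)
    return character


print(build_character("Thief", "Mage", 5))
# -> {'title': 'Mountebank', 'armor': 'leather', 'weapons': 'dagger, staff',
#     'casts_as_secondary_level': 3}
```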

I actually think it’s pretty workable, but it’s not something my main face-to-face play group would be interested in, and I have too much on my plate right now to pursue it further.  If I start a play-by-forum or play-by-post campaign, I’ll probably use Tunnels & Trolls instead of trying to sell people on and play-test some wacky homebrew.

What is Role-playing?

The Fine Art of the TPK asks
A short question, but by no means easy…

Instead, I have a question. An open call, if you will. Can somebody PLEASE define role-playing? Somebody will be a wise-ass and link the wiki stub, so I’ll just get it out of the way.

http://en.wikipedia.org/wiki/Role-playing

Specifically, I’d like to know how or why one game would have it in any more or less abundance than any other. You folks are incredibly bright, but you bicker over minor details WAAAAAAAAY too much.

Role-playing is playing a role. It is the player getting to make in-character decisions, and making those decisions for the character as if the character’s motivations, personality, goals, and such make a difference, so the character is being driven by their inner mental life. For instance, in role-playing, you try to maintain a distinction between what the player knows and what the character knows. (That’s not completely dispositive, since some hard-core wargames with fog-of-war rules also try to impose that distinction–though most of the time they will try to conceal certain knowledge from the player, such as by having chits upside-down until revealed, rather than asking the player to simply act as if he didn’t know what was there.)

It can be done from various stances, such as trying to imagine what it might be like to be that character, or by trying to construct a story so that the character seems psychologically plausible the way characters in good (or even not-so-good) fiction are.  The goal is to make it so that explanations of the character’s actions refer to the character’s role, and involve things like knowledge, beliefs, and desires, and to avoid making it so that the actions can only be explained by referring to things outside the role such as the actions of certain rules (“My character hates to see animals abused and goes berserk, so I held my action until the evil guy kicked the puppy because I wanted to trigger his Rage ability for the upcoming fight”)  or meta-game situations (“Carla has to leave in 20 minutes, so I attack the guy with the flag of truce.  We might as well get one combat in this session”).  Even worse is when rules or meta-game considerations prompt actions that are contrary to the role:  “My saintly pacifist attacks, because we might as well get one combat in this session.”

From this point of view, it’s obvious that some games are better or worse for role-playing. Despite the fact that you are assigned identities in Clue, it’s not a role-playing game. It doesn’t matter for game-play whether you’re Colonel Mustard or Miss Scarlet, and you would be considered strange or playing a prank if you insisted on trying to play it as if Colonel Mustard and the other characters had a distinct personality and approach. “Col. M wouldn’t think that Miss Scarlet was capable of the murder, because he’s a chauvinist, and a rope is not a woman’s weapon.” Even if the other players humored you and let you play that way (or you concealed the reasons for your decisions), the game doesn’t support role-playing and you’d be at a distinct disadvantage compared to playing it as intended, where the piece is just a token to push around the board.

Games that are intended as role-playing games can have features that aid or hinder players in making in-character decisions. In some cases, they might even make it impossible to make in-character decisions for certain situations; if those situations come up frequently in the game, the game is objectively worse for role-playing than the same game without those features.  For instance, games with lots of coercive personality mechanics can be hell on role-playing.  Even though they’re often built so that you can make a narrative that sounds as if it’s talking about mental state, the actual fact is that it’s a narrative about game state that’s out of the player’s hands.  The character did what he did because the rules and dice said he had to, not because the player played it that way.  “I say my character is brave, the stats on his character sheet say he should be brave, but every combat we’ve had so far the unlucky die rolls say that he’s run away.”  Or, in My Life With Master, you’re not making character decisions, you’re rolling to see what the character decides and narrating around that.  The game rules reach in and flip the character’s mental state, and the player carries it out.

Games can also make it difficult to role-play by putting too much knowledge or narrative power in the player’s hands.  Just as it can be a lot to ask of a war-gamer that he move his units as if he couldn’t plainly see that cavalry screened by the woods, ready to charge his flank, it can be a lot to ask of a role-player to separate what the character would want to have happen from what would make the most sense in the game world, or what would make the most interesting story in retrospect.

Games can also fall down by making the player have to care about things that the character cannot in principle know.  I’ve gone on at length before about how 4e’s Skill Challenge system falls into this category, so I won’t repeat it here.

Are there things that games can do to actually enhance role-playing, and make it easier?  Sure.  The very fact that games can be separated into role-playing games and non-roleplaying games shows that there are.  The major thing is to make it as much as possible so that character reasoning and game-rule reasoning are congruent, and that the game is responsive to the logical actions of the characters. The biggest thing that RPGs can do to emphasize the RP part is get out of the way.  Every time you tell the players that even though it would make sense for the characters to try X, they can’t because there’s no rule for it, you kill role-playing a little.  Every time you invoke a rule that changes the state of the game world in a way the characters can see and react to, but you can’t actually explain what it was that they saw happen (Own the Battlefield, I’m looking at you!), you kill role-playing a little.  If it’s not possible to eliminate some arbitrary construct in the rules, it can often at least be made real in the game-world so that the characters can think about it.  E.g. if there are things in the game that depend on level, such as spells that won’t affect people of a certain level, or have a duration based on level, it can be a big help to role-playing to have level be something that the characters can know and talk about.  Russell does this in his Hero Cults D&D setting, where levels are actual ranks in a quasi-religious hierarchy.  The rules should emphasize giving information at the character level, explicable in terms of things the characters understand, and game-play should emphasize overriding the rules whenever they give a result that forces the players out of playing the role and into just accepting that’s how things are because the rules say so.

RPG Systems and Granularity

Dr. Checkmate, guest blogging over at Uncle Bears, writes:

    • On a related note, d4 to d12 (or d4-2 to d12+2) doesn’t allow for a whole lot of granularity. You’re basically talking about all traits being on a scale of 1 to 5. Even somehow making it a scale of 1 to 10 would be an improvement.

I know what he means about granularity, but my experience is that more than about five steps doesn’t actually make much of a psychological impact.  Too fine a gradation, even if statistically significant, tends to get lost in people’s mental model of how things work.  Despite D&D 3+ grading attributes on a 3-18 scale, what actually matters is the -2 to +4 that usable characters tend to end up with.  Similarly, even though each Skill rank in D&D “matters”, the difference between 7 and 8 ranks in a Skill tends not to get noticed.  Even in systems like Hero and GURPS, which have you rolling 3d6 against a stat, the bell-shaped curve means that some points are more equal than others.   In my own home-brew before I switched to Savage Worlds I used a 1 to 10 scale for both Attributes and Skills, but realistically PCs had about 3-8 in anything they actually did (except for some combat monsters that I actually kind of wish weren’t so crocked).  Having a smaller spread in the general stuff but extra Disadvantages/Advantages actually seems to help players think of the characters as having distinct strengths and weaknesses, as well as opening up more actually playable characters. E.g. a middling Dexterity stat but a Fumble-Fingers Disad giving a minus to fine manipulation is more memorable and easier to work with than a rock-bottom Dexterity score, which in many systems is a death-sentence.
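
To put a number on the “some points are more equal than others” point, here is a quick sketch of the success chances for a plain roll-under-stat check on 3d6. It is a deliberate simplification (Hero and GURPS layer modifiers and margins on top of this), but the shape of the curve is what matters: a point in the middle of the range buys far more than a point at either end.

```python
# Why a bell curve makes "some points more equal than others": success chance
# for a plain roll-under-stat check on 3d6, and the marginal value of each
# additional point.  (A simplification of how Hero/GURPS actually resolve rolls.)
from itertools import product

rolls = [sum(dice) for dice in product(range(1, 7), repeat=3)]  # all 216 outcomes

def chance(target):
    """Probability that 3d6 comes up at or under the target number."""
    return sum(1 for roll in rolls if roll <= target) / len(rolls)

previous = chance(7)
for stat in range(8, 17):
    p = chance(stat)
    print(f"stat {stat:2d}: {p:6.1%} success  (this point was worth +{p - previous:.1%})")
    previous = p
```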

I sometimes wonder if something like the seven-plus-or-minus-two rule is at work here.  If a player can’t distinctly visualize all the steps at once, do they just chunk it until they can?

Not Everything Can Be Near

…because where would you put it?

In the previous post, I talked about Near and Far thinking in RPGs, and recommended that the GM try to make as much as possible in the game amenable to Near thinking.  As much as possible doesn’t mean everything, though; there are situations where it’s either not possible, or not desirable.

  • If the GM and the players don’t know (and can’t be expected to learn) enough details.  E.g. open-heart surgery, or starship hyperdrive repair.  In the former case it’s conceivable (barely) that in a game that’s about being a surgeon it would be worthwhile to learn enough about surgery to not only provide accurate descriptions, but also enough real choices of the sort that surgeons face to make Near thinking possible; in the latter, the details just don’t exist, and while the GM could certainly make them up and try to teach them to the players, the amount of effort involved in getting to the kind of free-wheeling, fully-grasped-problem-space thinking a player has when thinking about searching an ordinary desk doesn’t seem like it would pay off, even in a campaign about starship engineers.
  • If the situation is about performance, not decisions.  When the task at hand is something like playing the cello, it doesn’t really matter exactly what the GM or the player knows about cellos, or even music in general, because it’s the character’s physical skill that’s called on.  Now, if you were to search a cello…  Note that this is often going to be true of the physical activity of combat.  The strategy and tactics are decisions that can be carried out by the player, the physical activity of shooting the bow or swinging the sword is all the performance of the character.
  • If it’s about the character’s skill at making certain kinds of decisions.  Even if the GM and the player both understand what’s involved enough that they could go into detail, sometimes it’s about what the character can think or understand, not the player.  It’s often the case that the character is supposed to be better at thinking about certain situations than the player (sometimes the other way around).  In these cases it’s possible to use a skill roll to backstop or supplement the decisions that the player makes, but much of the time you should just substitute Far thinking.  Even if the GM and the player both know how to play chess, actually playing out the match between the character and Death isn’t likely to be a satisfying way of resolving it.
  • For pacing reasons.  There’s only so much time in a session, so sometimes even if the characters would have time to go through all the gory details the game is better off if you hand-wave it.  You don’t want to do too much of this, though.  It’s easy to imagine that you’re getting more done in the game when you fly by everything at 30,000 feet, using Far thinking all the way, when actually you’re just leeching out all the color and vibrancy and eliminating potential decision points.   You should only use this as an excuse when spending the time in Near mode is going to freeze out the other players for too long, or you know that they find that particular activity boring to think about in detail, or it lets you get to a different and more interesting Near mode episode immediately.

Near vs. Far Thinking in RPGs

    • The latest Science has a psych article saying we think of distant stuff more abstractly, and vice versa.  “The brain is hierarchically organized with higher points in the cortical hierarchy representing increasingly more abstract aspects of stimuli”; activating a region makes nearby activations more likely.  This has stunning implications for our biases about the future.

      All of these bring each other more to mind: here, now, me, us; trend-deviating likely real local events; concrete, context-dependent, unstructured, detailed, goal-irrelevant incidental features; feasible safe acts; secondary local concerns; socially close folks with unstable traits.

      Conversely, all these bring each other more to mind: there, then, them; trend-following unlikely hypothetical global events; abstract, schematic, context-freer, core, coarse, goal-related features; desirable risk-taking acts, central global symbolic concerns, confident predictions, polarized evaluations, socially distant people with stable traits.

Robin Hanson wasn’t thinking about roleplaying games when he wrote this, of course, but if he and the Science article are right about how minds work–and I think they are–then it has implications for how we play these games.  For one thing, it means that providing detail and concreteness isn’t just a matter of atmosphere and aesthetics, it literally changes the way we think about events in the game.

Take an example near and dear to my heart, the act of searching in-game:

Near

The GM determines there is a desk with three side drawers and a middle drawer, and taped to the underside of the middle drawer is a key.  The desk otherwise contains papers from old cases, none of them relevant, a gun in the top right-hand drawer and a bottle of rye in the bottom right-hand drawer.
Player: I search the desk.
GM: How?
Player: I look in all the drawers.
GM: You find a gun in the top right-hand drawer, a bottle of rye in the bottom right-hand drawer, and a bunch of papers.  They seem to be old case files.
Player: I flip through them and see if any seem relevant.
GM: Based on a casual flip through, none seem particularly interesting.
Because the player didn’t specify any action that would have uncovered the key, it’s not discovered.

or

GM: How?
Player: I look in all the drawers, then I take them out one by one.  I check the bottoms, and I look for false bottoms, and I check the holes, reaching around if necessary.
GM: That will take about fifteen minutes.
Player: I’ve got time.
GM: Ok, taped to the bottom of the middle drawer you find a key.  You also find a gun in the top right-hand drawer, and a bottle of rye in the bottom right-hand drawer.  There’s also a bunch of papers, that seem to be old case files, none particularly relevant.

Not as Near

GM determines the same set-up as before.
Player: I search the desk, looking in all the drawers.
Because the player didn’t specify actions that would uncover the key, the GM rolls the Player’s Search skill as a “save”, and gets a success.
GM: You find a gun, and a bottle of rye, plus some old case files.  On an impulse, you check under the drawers, and find a key taped to the bottom of the middle drawer.

Even Less Near

Same set up as before.
Player: I search the desk.
GM rolls vs the character’s Search Skill, and succeeds.
GM: You find a key taped to the bottom of the middle drawer, a gun in the top right-hand drawer, a bottle of rye in the bottom right-hand drawer, and some old case files.
If he had rolled a failure, the Player would still have found the gun, the files, and the booze, but not the key.

Far

The GM determines that the desk contains a gun, and a hidden key.  He doesn’t bother to think about where.
Player: I search the desk.
GM rolls, and the character fails.
GM: You find the gun, but nothing else of interest.

Even Farther

The GM determines that the desk contains a gun, and a key.  He doesn’t bother to think about what the desk looks like, where the items are or whether they’re hidden.
Player: I search the desk.
GM rolls, and the character fails.
GM: You find nothing.

Really Far

The GM doesn’t bother to determine anything about the desk.
Player: I search the desk.
GM rolls, and the character succeeds.
GM: You’ve got 1 success.  You need 2 more before you get 1 failure.

Just Plain Wrong

The GM determines the details as in the near cases.
Player: I look in all the drawers, then I take them out one by one.  I check the bottoms, and I look for false bottoms, and I check the holes, reaching around if necessary.
GM rolls, and the character fails.
GM: You find nothing.

Also Wrong

The GM doesn’t determine any details, but does determine the desk contains a gun and a key.
Player: I look in all the drawers, then I take them out one by one.  I check the bottoms, and I look for false bottoms, and I check the holes, reaching around if necessary.
GM rolls, and the character fails.
GM: You find nothing.

The thing about Near vs. Far is that it’s (probably) not a continuum, where you gradually lose detail and concreteness as you dial up the abstraction: at some point there is a modal shift in the kind of cognition you do.  I think that wherever possible, you want to keep things in the game world as Near as possible, so that the players remain grounded in the situation. This lets them reason about the game world, and not just about the rules.  It also provides more specific details to make the story more vivid, because it’s more like what we do when we’re faced with such situations in the real world.  Using Far abstractions is like having a scene cut to a placard that says “They search the room” and then cut back to show what they discovered.   If the GM doesn’t provide enough details that they could reason concretely (even if he backstops them with abstract game mechanics), then the players just move through a sort of fog of abstraction.  Everything their characters do seems to them to be more distant in space and time, and they’re more likely to group things mentally into larger, coarser categories, which can make it harder to keep their interest and attention since more stuff will be regarded as “the same old same old.”

Providing enough detail to make Near thinking possible in an RPG is more work for a GM, but I think it’s really important work, and pays off in making the experience much richer for everyone concerned.  When budgeting your effort in preparation, try to spend it on the details that the players will actually interact with to make the setting more concrete, and less on figuring out the broad strokes of distant events and times that shaped the game world.  A list of ten things that they can find in the desk beats 10,000 words on the lost empires of the Hyperborean Age.

Keep Your Filthy Narrative Out of My Roleplaying

My friend Russell writes

I think this is exactly right, at least as far as my tastes go.  Broadly speaking, there are three commonly found attitudes towards what you’re trying to accomplish when you play a roleplaying game.  I don’t want to resurrect the taxonomy wars, so I won’t label them, but the basic breakdown is:

  • Roleplaying games are about experiencing what it’s like to X
  • Roleplaying  games are about constructing stories that are like X
  • Roleplaying games are about playing a game (often a war-game) that draws elements from X

The problem is that these modes are largely incompatible.  If you’re trying to experience what it might be like to be faced with situations and making decisions in the game world, the last thing you want is to have narrative control over the game that the character doesn’t have; how can you face any uncertainty over whether your arrow will strike true when you can just declare that it does?  If you’re trying to play a game to exercise your tactical judgment and formulate clever strategies, it’s damn well cheating when the referee just overrules them in the name of plot.  If you are collaboratively writing a story in your favorite genre, it’s madness to allow that story to be warped or even ended prematurely by something as arbitrary as a bad die-roll.

This isn’t just idle speculation or caricature; these are genuine and deeply felt objections by people who are looking for a certain kind of entertainment from RPGs.  Take this gal, for example:

    • In addition, I challenge the entire premise [that “Character death should be a normal part of a well balanced but challenging adventure with natural consequences for poor choices.”]. Books and movies are excellent examples of my point of view. The main character isn’t going to die and you know it the entire time. No matter how steep the cliff, how deadly the bullets, how invasive the poison, the hero lives and we still have engaging blockbuster films and New York Times Bestseller novels. Why? Because the Story is Just That Good.

Leaving aside the question of whether blockbuster films and bestselling novels really are Just That Good, or whether they’d be even better if there were some actual uncertainty as to the outcome, this is clearly a gal who is not only looking for a way to construct stories, but doesn’t even have a glimmer that there might be people looking for other things, people who would therefore find the justification that something happens in films and novels to be unpersuasive, if not a complete non sequitur. (Or maybe I’m just reading too much into her “challenging” the premise rather than simply disagreeing with it.)

What bothers me is not that the folks who are primarily about constructing narratives exist (de gustibus),  but the blithe assumption that everybody else who plays RPGs shares their tastes, even if they don’t know it yet. (BTW, I don’t intend to single out Viriatha above as an example of that.  I’m talking more about an attitude I perceive all over the place in posts on how to structure your roleplaying session as if it were scenes from a movie, how to design your villains to play up the themes of the story, how to drop detail and consistency from the setting if it doesn’t feed into the main narrative, and so on.) What I miss is any sense that “Your mileage may vary.”  It’s not that I want to see every blog post or forum comment come with a disclaimer “only suitable for certain tastes in roleplaying”, but that I think the advice would be sharper and more on-point if the authors kept in mind that they’re talking about a specific approach to RPGs.  For one thing, they’d spend less time running down the alleged flaws in other styles of RPGing, which should give them more time to devote to their particular style.  More than that, though, I think that the recognition that they are aiming to accomplish one particular kind of thing by playing RPGs would help them separate the wheat from the chaff for their approach; there are a lot of things that are carried over from game system to game system in our hobby because that’s what people are used to, but are irrelevant if not counter-productive for certain styles of gaming.  The result, it seems to me, is a lot of patching of things that get in the way when they should be jettisoned instead.

Take, for instance, Fate or Hero Points.  Such things are often added to systems that have important things, like character life or death, decided by a random die roll, to give players a measure of narrative control; the justification is almost always along the lines offered above, to make the game more like a blockbuster movie or bestselling novel.  The problem is that this is a band-aid.  If what you’re aiming for is a properly-constructed, satisfying story, having a limited number of times you can overrule a story-killing die roll makes no sense.  An unsatisfying end to the story doesn’t become more satisfying because at least you managed to avoid derailing it the first three times it happened before you ran out of Fate points.  You shouldn’t be rolling dice if you don’t want a random outcome.

On the other hand, and this gets back to my original point and the title of this post: if a limited pot of Fate Points is insufficient to satisfy the legitimate desires of those who are playing for narrative, the existence of such things in the system also frustrates the legitimate desire of those who are playing for the experience, which is not to be forced to confront profound game decisions that can’t be made in character.  I don’t want narrative control when I’m trying to imagine the experience of the character, because it screws it all up; if the character actually had that control, the story would turn into simple wish-fulfillment, if not an outright Mary Sue (as well as breaking a lot of settings where there’s no conceivable reason that a character would have that kind of power).  The more important and the more fraught with consequence the moment is, the less I want to be jerked out of it by meta-game considerations.

Similarly, from the other direction, there are those who think that “something must be done” to prevent the horrifying possibility that some logical, perfectly consistent feature of the game world (such as encountering something unexpected when crossing the dangerous wilderness) could screw up the game balance, so that the set-piece encounter at the end of the journey is no longer a fair contest or the wealth-per-level guidelines get thrown out of whack.  Again, it’s not that they’re wrong to want the game the way they want it, but a greater recognition of what their particular desires are would probably help them narrow the focus of the game to what they actually enjoy.  If you’re going to remove the random encounters as being a pointless and potentially unbalancing distraction from the encounters in the dungeon, you should probably go ahead and remove the travel to the dungeon as well.  Why should there even be a situation where “The PCs are heading to the dungeon and will eventually get to the dungeon, but not this session, and they need a combat to get them moving”?  Just wave your hands and say “Three weeks later you arrive at the dungeon.”

You want a laser-like focus on what you and your players actually find fun, and you want to ruthlessly trim the things that get in the way of that.  But to do that, you need to understand what it is that your players actually want to accomplish by playing RPGs, and to do that you have to keep in mind that what they’re after might not be the “obvious” point of roleplaying to you.  Otherwise you might find that you’re trimming the reason that they enjoy playing, and focusing straight on what they are trying to ignore.

The Rule of Goofy

Actually, I think there is a Rule of Goofy, and it goes something like this:

The ability of an element to shatter the Suspension of Disbelief is directly proportional to how goofy it is. Reality is no excuse for fiction.  If the audience finds something so goofy that they are thrown out of the moment in order to analyze or scoff at it, it doesn’t matter how realistic it actually is or how well-documented it is that such things occur either in the real world or in the genre.  Presenting evidence might get them to move past it, but the damage is done and the momentum is lost.

For RPGs one might add that the element might be a result coughed out by the game system that, while 100% accurate to the rules, is goofy in context.

While Dr. Checkmate is correct that one man’s cool is another man’s goofy, that doesn’t mean that the rules can’t be usefully applied, only that you have to know your audience.

The Rule of Cool: A Useful Tool

The Geek, at Geek Related, writes:

    • I’ve been following the debate about the so-called “Rule of Cool.”  It’s a “TV Tropes” concept extended to RPGs by  the Chatty DM, (original post “The Rule of Cool” here, and clarification “The Rule of Cool Takes Flak” here).  A number of people gave it drive-by disses, but I think the most on topic one is from 6d6 Fireball, with Rule of Cool – Only for Idiots and Of Coolness and Idiocy.

      In short, the Rule of Cool states “The limit of the Willing Suspension Of Disbelief for a given element is directly proportional to its degree of coolness. Stated another way, all but the most pedantic of viewers will forgive liberties with reality so long as the result is wicked sweet and/or awesome. This applies to the audience in general, as there will naturally be a different threshold for each individual in the group.”

      If you interpret it very loosely as “Hey, toss in some cool stuff to spice up your game” it’s fine.  But the way it’s stated is setting up “cool” as being carte blanche to roll over realism/suspension of disbelief.  “If it’s cool enough, it can be incoherent and it’s all good.”

First off, as a psychological observation, The Rule of Cool is simply true.  This is a form of “Confirmation Bias”: people find it very difficult to notice discrepancies and logical errors in things that they are favorably disposed towards.  Contrariwise, they’ll nitpick to death something that they find disagreeable, boring, or challenging to their preconceptions.  Indeed, I’d say several of the bloggers who strongly object to Chatty DM’s post are displaying that very behavior.

Second, the Rule of Cool is part of the basis of the hobby.  Practically every RPG relies on the Rule of Cool to excuse inconsistencies and absurdities in the setting.  If your players see a dragon and don’t immediately start in on how such a creature violates the square-cube law and should barely be able to walk when not buoyed up in a swamp, let alone fly, and breathing fire is absurd, why the caloric requirements alone…that’s because they think it’s cool and are willing to suspend their disbelief to that extent.  Magic, psionics, Cthulhoid horrors, vampires, sexy secret agents licensed to kill, giant mecha, Wild West zombies, super heroes, intelligent bunnies, faster-than-light travel, net-running deckers, swashbuckling pirates, private detectives solving murders, artificially intelligent robots, aliens…if it’s fodder for RPGs, it requires a large dollop of willing suspension of disbelief, and that disbelief will only be provided by people who think those things are cool enough to be worth pretending to believe.  Geek culture has become so entwined with pop culture in the past few years that fans of fantastic literature (which is most RPGers) can lose sight of the fact that not everybody really thinks this kind of stuff is cool; there are still plenty of people who think it’s all dumb and puerile, absurd escapist crap that doesn’t deserve any suspension of disbelief.  There are people who look at Spider-Man or The Dark Knight with the same visceral revulsion for the CGI and melodrama being offered as “cool” that others do for Michael Bay or Uwe Boll movies.

Third, as a piece of advice, Chatty’s take on the Rule of Cool is a useful guide to where to spend your limited time and effort as a GM in preparing and running a session.  You are better served spending your time making sure that your game is going to be enjoyable to your players, so that they’ll want to overlook the inevitable holes, than trying to make sure there are no holes to be found without regard to whether it satisfies the players.  It’s an RPG; it’s going to have holes–you can’t present an entire world, even a perfectly mundane world, in its entirety, in the form of a game without gaps or errors–and if the players are in the mood to, they’re going to be able to quibble over events and raise objections even if you’re doing nothing more than reprising your day at the office in a session of “Papers & Paychecks.”  No matter what you and your players think of as cool, RPGs have a hurdle that more passive consumption of media doesn’t: getting the players to engage the world.  They can’t let it just wash over them and still be playing, so you have to make them want to play.  And to do that, you’re going to have to grapple with what it is that they think is cool enough to be worth it.

Finally, it’s completely irrelevant whether you happen to be tickled by Chatty’s particular examples; if you don’t think that stuff is “cool,” substitute what you do think of as cool.  And you can’t weasel out of it by claiming that cool by definition means CGI explosions and running up streams of bullets to kick somebody in the face…that’s a straw man.  If you and the players actually think that’s cool, then it wouldn’t be an objection; it’s only when you believe that it’s over-the-top and uncool that you re-engage your critical faculties, and the whole point of the Rule of Cool is to provide what the audience/players actually think is cool.  The TV Tropes site that it’s taken from is absolutely explicit:

Note that you only get to invoke the Rule of Cool if the end product is, in fact, cool. Note also that different opinions on what is “cool” create the most arguments over this.

Instead of trying to come up with an uncharitable reading of the Rule of Cool to make it self-evidently stupid (If you add enough CGI explosions you can hide any ludicrous inconsistency or plot-hole! Not being obsessed with consistency is the same thing as not caring about it at all!), detractors ought to think about what it is they find cool about the settings they enjoy playing…and then try to think about what somebody with a more jaundiced eye would find absurd and disbelief-destroying about that setting.  Then they might begin to apprehend both what the Rule of Cool is really about, and how to use it to improve their games by emphasizing what the setting provides in preference to all other settings (what they really do find cool about it) to help the players enjoy the setting for what it is and ignore the inevitable gaps and debatable points. It’s not carte blanche to roll over suspension of disbelief; it’s an encouragement to analyze what it is that causes people to engage their suspension of disbelief and provide more of whatever that is.

Let’s Get Critical!

Critical hits are fun.  Players enjoy big, flashy, unusually good events.  Some enjoy them so much that they play systems where they can narrate them right in, instead of waiting for the dice to serve them up, but that’s a topic for a different day.  This was driven home to me when I was running games with my home-brew.  It was a skill + roll system based around 2d6, but it didn’t contain criticals, automatic hits, or fumbles.  Every time a 12 came up there was a murmur of excitement around the table, followed by a sigh of disappointment when the players realized that it wasn’t a critical hit–in fact, due to the slightly unusual way the dice were read*, a 12 was usually a failure.  After a couple of months I finally gave the players what they were looking for and made 12 a critical hit, giving max bonus and a special result on top, and the cycle of Woohoo!  Awwww…. was over.

Critical hits are one of the first things that DMs think of adding to an otherwise fairly abstract combat system like D&D, and some games became notorious for their critical hit charts.  Since they only get rolled once in a while, it’s possible to have a big chart with really detailed results without slowing things down much at all, and the chance of getting, say, a broken arm instead of just 8 hit points gave combat a grittier feel that a lot of players really appreciated.

The biggest problem with critical hits is that in combat heavy games there’s a built-in asymmetry between the PCs and the NPCs even if they’re using the same rules.   PCs get a lot of dice rolled against them during the course of a campaign–orders of magnitude more than any individual NPC that they might encounter–and depending on the system they may well get more rolls against them than they make even in an individual combat, between often being outnumbered by the monsters, many monsters getting multiple attacks per round (the infamous claw/claw/bite) and PCs usually having lots more hit points before they are rendered hors de combat (once you figure in magical healing).  That means that even really unlikely events will eventually hit the PCs, and on the whole the PCs will take more criticals than they dish out.  At which point the rules that were originally added to give the players some more WooHoo! end up serving up heaping helpings of Oh Crap! instead.  Insta-Kill crits are particularly unpleasant in this regard.  And, as commenter Scott said over on the post Making Critical Hits More Interesting at Inkwell Ideas “a smashed ankle matters very little to the NPC who’s going to die in a couple of rounds, but very much to the PC who’s going to suffer until he can get a heal cast.”
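
To put rough numbers on that asymmetry, here is a quick sketch assuming a flat 1-in-20 crit chance on each attack roll a PC faces, with independent rolls; the attack counts are arbitrary examples, but the trend is the point.

```python
# How quickly "really unlikely" catches up with the PCs: chance of suffering at
# least one critical hit, assuming a flat 1-in-20 crit chance per attack roll
# faced and independent rolls.  The attack counts are arbitrary examples.
CRIT_CHANCE = 1 / 20

for attacks_faced in (10, 50, 100, 500):
    p_at_least_one = 1 - (1 - CRIT_CHANCE) ** attacks_faced
    print(f"{attacks_faced:4d} attack rolls faced: "
          f"{p_at_least_one:.1%} chance of at least one crit against the PC")
```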

A second, lesser, problem is that with systems that keep criticals fairly abstract (say, by awarding double damage but no extra result beyond that) it’s possible to get a critical hit but follow it up with a disappointing roll for damage…the fact that you’ve done 2 points instead of the 1 you would have rolled is cold comfort, and in terms of the emotions that rolling dice adds to the experience, you’d probably be better off not having rolled a critical in the first place.  It becomes an artifact of the abstraction mechanism rather than a proxy for a game-world event; in the game-world it’s presumably not “My arrow hit him in the eye slit!….But it doesn’t seem to have slowed him down any….”  And if that’s at all a common result of rolling a critical, you have to start asking whether it’s really worth having them in the game.

So, my suggestions for treating critical hits in games like D&D are as follows:

  1. Have them be something PCs do to NPCs, not vice-versa.  Or, if symmetry between PCs and NPCs is important to you (so there’s no “PC glow”), then at least have the NPCs’ criticals do abstract damage, such as double damage, instead of rolling on a chart for specific results such as limb amputation.  Otherwise you have to be prepared for most PCs to die or suffer career-ending injuries a lot sooner than their toughness as measured in hitpoints and armor class would otherwise indicate.
  2. If you want to have PCs sometimes face the possibility of a long-term or crippling injury, tie it to something less common than a 1-in-20-shot critical hit.  One neat idea (borrowed from Savage Worlds) is to tie it to the PC becoming incapacitated.  In D&D that would mean getting knocked down to zero HP.  Whenever the PC hits 0, roll on the injury chart (possibly the same chart as the PCs have been dishing out to the NPCs); have the penalties for the injury persist even in the face of magical healing unless extra time is spent and a Healing skill roll is made, or a more special-purpose spell (such as regenerate) is used.  If you just slap Cure Serious Wounds on somebody with a shattered ankle, they get the hitpoints back and can fight again, but the ankle has been healed crooked.
  3. If you’re using abstract damage criticals, either just award max damage for the dice (so a crit on a d8 weapon automatically does 8 points, which is about the expected value of rolling 2d8 anyway), or if you insist on rolling have a minimum of the expected value. E.g. Roll 1d8 and multiply by 2, but have it be 9 points minimum (2 * expected value of 4.5) so that you avoid the WooHoo! Awww phenomenon.
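
For the d8 example in suggestion 3, here is the arithmetic spelled out: comparing a plain doubled roll, automatic max damage, and a doubled roll with the suggested floor of 9 shows how the floor removes the anticlimactic low results. This is just illustration of the numbers above, not a recommendation of any one option.

```python
# Crit-damage options for a d8 weapon, per suggestion 3 above: a plain doubled
# roll, automatic maximum damage, and a doubled roll with a floor of 9 (twice
# the 4.5 expected value), which removes the "Woohoo!... Awww" low results.
faces = list(range(1, 9))  # 1d8

def average(values):
    return sum(values) / len(values)

options = {
    "normal hit (1d8)":     faces,
    "crit: doubled roll":   [2 * roll for roll in faces],
    "crit: automatic max":  [8] * len(faces),
    "crit: doubled, min 9": [max(2 * roll, 9) for roll in faces],
}
for name, damages in options.items():
    print(f"{name:22s} average {average(damages):5.2f}, worst case {min(damages)}")
```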

* instead of adding the two dice, you used whichever face was lower.  Doubles were zero.  This yielded results from 0 to +5, weighted towards the 0 end; this meant you always performed at least as well as your skill (a concept borrowed from CORPS) so you never had to roll for tasks with DC <= your skill, but you had a decent chance of getting slightly better than that up to a slim chance of getting much better.  But double-six counted as zero…bummer.   The revised version had double six count as a +5 and a special result.  It barely changed the expected value, but had a big impact on the excitement that players got from rolling.
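
For the curious, here is a small sketch that computes the distribution of that take-the-lower-face mechanic before and after the double-six change; the numbers are just what the description above implies, and they show how little the expected value moves while the worst roll becomes the best one.

```python
# The take-the-lower-face 2d6 mechanic described above: doubles count as 0,
# otherwise use the lower of the two dice.  The revised rule turns double-six
# into a +5 (plus a special result) instead of 0; this shows how little that
# shifts the expected value.
from collections import Counter
from itertools import product

def result(a, b, revised):
    if a == b:
        return 5 if (revised and a == 6) else 0
    return min(a, b)

for revised in (False, True):
    outcomes = [result(a, b, revised) for a, b in product(range(1, 7), repeat=2)]
    counts = Counter(outcomes)
    expected = sum(outcomes) / len(outcomes)
    label = "revised (6-6 is +5)" if revised else "original (all doubles are 0)"
    distribution = "  ".join(f"+{value}: {counts[value]}/36" for value in sorted(counts))
    print(f"{label}: {distribution}   expected value {expected:.2f}")
```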

The Necessity of Random Encounters in D&D

The author goes on to list what he sees as the advantages and disadvantages of random encounters, but quite remarkably to my mind never actually mentions the real purposes of random encounters in terms of setting and game design.  So he lists Pros as being things like killing off annoying characters or filling time, and the Cons as serving no story purpose or throwing the wealth per level guidelines out of whack (and I need to rant about that some day).   No mention at all is made of anything relating to the setting, or verisimilitude, or even resource management.

The post seems to ignore the two most important features of random encounters: naturalism and husbanding resources.  They’re the GM’s chief tool for presenting the setting as a world that actually contains stuff that isn’t there for the sole purpose of being part of the PCs’ story, and they’re the game system’s main check on the players completely optimizing their resources (particularly daily powers in D&D)–the chance of such an encounter is why players have to keep something in reserve.

(I’d like to get a bit of definition out of the way: by random encounters I mean any encounter that isn’t determined by story needs or the PCs’ direct actions.  It doesn’t necessarily literally have to have come about by rolling dice on a table, though that’s certainly an option, but it’s something that isn’t required as a plot-point of the story or because the PCs have decided to seek out, say, the chief of the palace guard and have an encounter with him.)

Naturalism is important, in my opinion, even if you’re running a story-oriented sort of game.  If the setting contains no features at all that are independent of the needs of the story, then the world will lack all verisimilitude and feel flat and lifeless…if it doesn’t degenerate into parody.  The central joke of Knights of the Dinner Table, after all, is that the GM is stuck with three players out of four who refuse to see the world as containing any features that aren’t clues, prizes, antagonists or (rarely) allies.  If there’s a cow, it must be a magic cow and they capture it and drag it along; if there’s a gazebo it’s a hostile encounter.  But if you don’t have random encounters, then the players will be absolutely right in assuming that if the GM bothers to mention it, it must be significant.  The world will lack any depth.  This, btw, is the curse of many of the graphically intensive computer RPGs…players correctly assume that if something can be interacted with on-screen it must be significant, because the programming and art resources won’t be wasted on mere flavor.  But if the setting contains random encounters, and the players are aware of it, they are thrust into the much more realistic position of no longer knowing whether something they run into is there by chance or design.  They have to reason about the logic within the gameworld instead of logic about the story, which I think is not only much more satisfying, but makes for better stories.  If the players can correctly reason that the vizier is secretly the bad-guy, because viziers are always the bad-guys and besides, he has a goatee, the resulting story only works as a comedy.

While naturalism is valuable for pretty much any kind of system, resource management is peculiar to certain kinds of systems and settings…but is a particularly important part of D&D and its progeny.  If you have a system where resources are defined in terms of their availability per day, per encounter, etc, and are replenished by rest (rather than, say, going back to the store and buying more ammo) then it’s an essential part of the design that the players have to consider whether they’re likely to have to call on those resources at times not of their own choosing.  The random encounter is what balances the X times a day abilities against those that can be used continuously (such as swinging a sword).  If you take it away, either you have to add time pressure to every scenario (which can be quite a strain on verisimilitude) or you have to ramp everything up (or scale the resources back) to match the assumption that the party will always have its full resources and be willing to expend them all.

You could think that it needn’t truly be random, and that as GM you can just devise the encounters just so to make the party expend resources at precisely the right pace, but IMO you’d be wrong.  You’d be wrong because the players aren’t stupid, and they know the game, and they know that as GM you have infinite resources to throw against them, so they will reason that if you hit them with something when they’re particularly low on resources it’s because you’ve chosen to be unfair.  Which is true.  Without randomness whatever you do to them you’ve explicitly chosen to do.  But that means unless you’re willing to be a jerk and kill them just because you can (and good luck getting people to play with you once you’ve established that reputation), you had better not hit them with anything challenging when they’re low on resources–unless you’re also willing to cheat like mad so they come out on top despite it.  But if they know you won’t do that then they’ll be all the more likely to spend all the resources they’ve got and then turtle.

Openly and publicly using random encounters is the solution to that whole set of problems.  If they know that there’s a certain chance of random encounters per period depending on the environs, and some of them might be hostile, then it’s up to them to decide whether to hold something in reserve or chance it–and whether hunkering down in place to recover resources is worth the risk or even possible.  The GM doesn’t have to decide to punish or not punish them for recklessness or over-caution…the setting has certain known features and the players can roleplay whether and how much risk they want to take given the stakes and circumstances.
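
As a concrete illustration of “a certain chance of random encounters per period depending on the environs,” here is a sketch of one classic-style procedure: a d6 check per watch against terrain-based odds. The terrain names and odds are made up for the example rather than taken from any particular rulebook; the point is that the procedure is simple enough to run openly at the table, so the players can weigh the risk themselves.

```python
# A sketch of an open random-encounter procedure: one d6 check per watch, with
# an encounter occurring on a roll at or under the terrain's listed chance.
# The terrain names and odds are illustrative, not from any particular rulebook.
import random

ENCOUNTER_CHANCE = {  # encounter on this number or less on 1d6, per watch
    "civilized road": 1,
    "wilderness": 2,
    "dungeon": 2,
    "haunted moor": 3,
}

def check_for_encounter(terrain, rng):
    """Roll 1d6 and compare it to the terrain's encounter chance."""
    return rng.randint(1, 6) <= ENCOUNTER_CHANCE[terrain]

rng = random.Random(1)  # seeded so the example output is repeatable
for watch in range(1, 7):  # a day's travel: six four-hour watches
    if check_for_encounter("wilderness", rng):
        print(f"Watch {watch}: encounter; roll on the wilderness table")
    else:
        print(f"Watch {watch}: no encounter")
```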

If you’re really considering eliminating random encounters in your game, what you need to think about is how you intend to convey the texture of the setting without giving the sense that the PCs are locked in The Matrix, where everything is just an illusion for their benefit, and, in a D&D-like game, how you’re going to deal with the players wanting to blow all their resources in each encounter and then sit around to recover them for the next one.  Random encounters aren’t the only way to deal with either problem, but I think they’re one of the simplest and best approaches I’ve seen.