Tuesday, October 30, 2012

They Also Serve Who Only Stand and Wait

"[T]housands at his bidding speed,
And post o'er land and ocean without rest;
They also serve who only stand and wait."
-- John Milton, "On His Blindness"

There's a belief I've seen expressed by a number of developers that can be paraphrased as: "All players want to be The Hero. Every gamer expects to be the all-powerful savior, prime mover of all action, star of the show, center of all attention. Therefore every game that personifies the player as a character in a world must be designed to allow the player to be the hero. And that goes for multiplayer games, too."

Not to pick on Emily Short, who is a respected creator of Interactive Fiction games, but an example of this perspective can be found in a talk she gave at GDC Online in 2012, summarized on Gamasutra as Making Everyone Feel Like a Star in a Multiplayer Game. As Gamasutra's Frank Cifaldi put it: "Even in a multiplayer game, every player has to feel as if they are playing out their own personal, unique story. They cannot feel as if they are in a supporting role, or their investment in the narrative will fall apart."

It's a nice theory. Is it true?

I'm Ready For My Close-Up

For some gamers, it is true. They do want to be the hero, and they do expect any and every personification game (where you play as a character) to cater to that desire. Character-based games, they feel, are essentially power fantasies where the world is supposed to revolve around them. Although they are unlikely to say it this way, these gamers expect every feature in a game to be about letting them express dominance, either "physical" (as through combat in a three-dimensional game world) or emotional (as in much interactive fiction).

Even if a desire to follow some version of Campbell's "Hero's Journey" isn't baked into people, gamers today have grown up with games in which you are the hero who saves the world. So many games follow this pattern that it would be surprising if many gamers have not come to expect it as a natural, even required, element of all personification games.

The problem is that this expectation of epic centrality is demonstrably not true for all gamers. Despite the pattern, not every gamer wants to be the hero. There's evidence of a meaningful minority of gamers who are happiest when a game gives them ways to help other players succeed. These gamers truly do not want to be the star -- they prefer a supporting role.

The Cleric as Un-Hero

Consider the four archetypal classes: warrior, wizard, rogue, cleric. Warriors deal mêlée damage and are pack mules; wizards cast ranged spells and know lots of lore; rogues backstab, detect traps and steal shiny things. All of these are heroic in their own way -- their gameplay content is about acting for themselves. The actions the game is designed to allow them to take are all focused on their own self-enhancement.

But clerics, while they can sometimes do divinely-inspired damage, are mostly about healing the wounds or diseases of other characters, along with protecting ("buffing") other characters. That's been the traditional functional definition of the cleric role in roleplaying games dating back to Dungeons & Dragons (and probably before that). A modern addition is some form of "crowd-control" feature, but the function is the same: providing support to the actively heroic characters.
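The structural difference described above can be sketched in code. This is a minimal, hypothetical illustration (none of these names come from any particular game): the cleric's verbs all take someone *else* as the beneficiary, where a warrior's or wizard's verbs act for the character performing them.

```python
from dataclasses import dataclass

@dataclass
class Character:
    name: str
    hp: int
    max_hp: int
    armor_bonus: int = 0   # temporary protection granted by buffs
    rooted: bool = False   # simple stand-in for crowd control

def heal(target: Character, amount: int) -> None:
    """Restore another character's hit points, capped at their maximum."""
    target.hp = min(target.max_hp, target.hp + amount)

def buff(target: Character, bonus: int) -> None:
    """Protect ("buff") an ally with a temporary armor bonus."""
    target.armor_bonus += bonus

def crowd_control(enemy: Character) -> None:
    """Pin an enemy in place so the actively heroic classes can act."""
    enemy.rooted = True

# Every one of these abilities requires a target other than the caster:
tank = Character("warrior", hp=40, max_hp=100)
heal(tank, 75)   # capped at max_hp: tank ends at 100, not 115
buff(tank, 5)
```

Notice that a self-centered "hero" ability set could be written without the `target` parameter at all; requiring one is exactly what makes the role other-focused.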

That style of play is not about indulging power fantasies. The game actions that a cleric is built to perform aren't centered on the person playing the character, but on other players. So why do developers bother implementing a cleric role at all, if character-based games are supposed to be about letting the player feel like a hero?

The World Needs a Healer

One explanation is that the healer role is included simply as a matter of utility. If a game is pretty much all about killing (as most computer games are), then to make it interesting there needs to be some risk of being injured yourself. If there's no way to heal your own injuries, then you need someone else to do the healing. And in a typical fantasy setting, that character is the cleric. In a modern setting, this role is often called a "medic," as in Valve's Team Fortress 2 multiplayer game, but it's pretty much the same other-focused functionality.

But of course no computer game is required by its design to have some other person heal your character. It's simple enough to provide the healing function through potions or stimpaks that magically undo character damage. And yet game developers keep implementing character class roles whose abilities are focused on helping other characters.

Perhaps developers do this because enough gamers like playing clerics to justify moving those abilities to a separate class role. But that raises the question: if a roleplaying game offers a role whose primary function is to support other players, why are there so many gamers who are happy to fill that role? If everyone really expects to be the star, who are all these people looking to be part of the supporting cast?

The Craft of Helping

In fact, clerics aren't the only source of supportive abilities in roleplaying games, particularly in the massively multiplayer online variety (MMORPGs). A popular alternative activity in these games is "crafting," which involves creating objects that are usable and useful inside the game world.

There is pleasure in the crafting of new things (though, from my Explorer perspective, that motivation is almost never emphasized), but most crafting in MMORPGs exists to provide useful objects for other players. Often these are specific to combat gameplay -- weapons or ammunition -- but crafting can also be defined as a source of tools such as fishing poles or resource detectors.

Either way, in a game where usable objects can be looted from defeated enemies, implementing crafting gameplay ensures that combat players don't have to hope and wait for certain items to drop as loot. Crafting also allows some players to serve a useful role in a game without forcing them to participate in direct combat gameplay. This allows more people to play the game (and pay for the privilege) than a combat-only game would attract.
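The "hope and wait" point can be made concrete with a little arithmetic. Assuming, purely for illustration, a 2% chance that a needed item drops from any given kill, a combat player expects to need about 50 kills -- and even after those 50 kills there's still roughly a one-in-three chance of coming up empty. A crafter can simply make the item.

```python
# Illustrative numbers only: assume a needed item has a 2% drop chance per kill.
drop_chance = 0.02

# Mean number of kills before the item drops (geometric distribution).
expected_kills = 1 / drop_chance

# Probability of seeing at least one drop within those 50 expected kills.
p_after_50 = 1 - (1 - drop_chance) ** 50

print(expected_kills)        # 50.0
print(round(p_after_50, 3))  # 0.636 -- so ~36% odds of still having nothing
```

Randomized loot keeps combat players grinding; crafting converts that lottery into a deterministic supply chain, with a different player at the other end of it.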

One of the best-known descriptions of this playstyle preference is the article posted to Stratics in 2001 by Lloyd Sommerer (as "Sie Ming"): I Want to Bake Bread. In this plea for game developer understanding, Lloyd ably points out the kinds of supportive behaviors that some gamers enjoy providing, and wonders why developers don't seem interested in the benefits a game can obtain from including features that attract gamers like these.

It's still a good question.

Supportive Play is Good Gameplay

To sum up: some people enjoy helping other people, but few games reward that playstyle. That's a missed opportunity, both in terms of revenue and of including genuinely helpful people in your gaming community. If they don't want to play the hero, that's OK, and smart developers will create gameplay for them instead of trying to force them into the hero's boots (which won't work).

The people who come to a computer game wanting to play a healer, or a maker of things, are there specifically because they want to play a character-filled game that does not force them into the spotlight. Being able to play a supporting role satisfies a deeply held need of some people to be of service to others. These helpful souls are not only content to let others have the limelight, they actually prefer it that way. Their pleasure comes from helping others succeed.

That temperament is in direct contradiction to how pretty much every personification game is designed. Whether you like it or not, you're forced to be the star, to make all the big decisions for yourself and maybe for others, too.

But by assuming that every game has to be designed that way, developers are telling many would-be gamers that their playstyle interests aren't wanted. That's a shame both artistically and commercially.

Games don't need to be only about support roles, either. A game where the player can only be a healer or a crafter might be fun, but it's not necessary to go that far.

A game that offers the option of rewarding players for being supportive, for helping out NPCs or other players in ways that don't involve saving the world or being put on a stage for it, would be enjoyable to more people. It would also bring more cooperative play to gameworlds that are often harshly contentious.
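One way to offer that option -- purely a sketch, with hypothetical names and weights, not a description of any shipped game -- is to let other-focused actions feed the same progression system that combat does, so a healer or crafter advances without ever taking the stage:

```python
# Hypothetical reward weights; a real design would tune these carefully.
SUPPORT_REWARDS = {
    "healing_done":     0.5,   # XP per point of damage healed on others
    "buffs_applied":   10.0,   # XP per protective buff cast on an ally
    "items_crafted":   25.0,   # XP per usable item made for another player
    "npc_tasks_helped": 40.0,  # XP per NPC helped in a non-combat way
}

def support_xp(actions: dict) -> float:
    """Total progression earned from supportive (other-focused) play."""
    return sum(SUPPORT_REWARDS.get(kind, 0.0) * count
               for kind, count in actions.items())

# One play session with no combat at all still earns meaningful progress:
session = {"healing_done": 300, "items_crafted": 4, "npc_tasks_helped": 2}
print(support_xp(session))   # 300*0.5 + 4*25 + 2*40 = 330.0
```

The design point isn't the specific numbers; it's that supportive actions are first-class inputs to advancement rather than an unrewarded side activity.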

As game design goals go, that's not a bad one.

Thursday, October 18, 2012

Squeal vs. Squee: How Game Sequels are Received

The recent release of Resident Evil 6 generated a fair bit of discussion regarding how some players of previous installments of the survival horror franchise are not happy with RE6's new, more action-oriented direction.

That raises an interesting general question. What makes a new game in a series welcomed, tolerated, or reviled by fans of previous entries?

Assuming the later games are of equal or better quality than the first, what makes one game "a welcome update to a series getting stale" and another "a betrayal of everything that fans of this series have come to expect?"

Bearing in mind that love and hate for particular games are often highly subjective responses (to put it politely), I think it's possible to make some broad but useful observations. Here are some suggested categories of gamer sentiment regarding sequels:

Loved:

  • Wing Commander 1 & 2 => Wing Commander 3 & 4
  • System Shock => System Shock 2
  • Thief => Thief 2: The Metal Age
  • Uncharted: Drake's Fortune => Uncharted 2: Among Thieves
  • TES IV: Oblivion => TES V: Skyrim

Mostly welcomed:

  • Deus Ex => Deus Ex: Human Revolution
  • UFO: Enemy Unknown (X-COM) => XCOM: Enemy Unknown by Firaxis

Mixed:

  • TES III: Morrowind => TES IV: Oblivion
  • Deus Ex => Deus Ex: Invisible War
  • System Shock 1/2 => BioShock
  • Mass Effect => Mass Effect 2

Mostly negative:

  • Fallout 1/2 => Fallout 3
  • UFO: Enemy Unknown (X-COM) => XCOM by 2K Games [tentative]
  • Resident Evil 1-4 => Resident Evil 6

All of these are debatable in their details. I don't agree personally with all of them, and some of them may not even be accurate in an objective sense. But they do, I think, accurately reflect how these games are assessed by gamers generally. So for now, let's assume that you're willing to accept most of the category assignments I've proposed here. Some sequels are loved, some are accepted, and some get mostly bad press.

What do the games in each category have in common with each other? And what sets them apart from the games in the other categories?

One fairly obvious difference is time -- specifically, how much time has passed from one entry in a series to the next. Games that are perceived as improvements on their predecessors tend to be released fairly soon after the prior game, while games made much later tend to be judged more severely. Skepticism probably colors beliefs before a late sequel is released, and nostalgia for a very highly regarded earlier game makes a fair comparison harder for any follow-up. This suggests that it's a good idea to have a design for a fairly similar sequel ready to implement if the initial game in a new franchise takes off.

Slightly less obvious, and related to time, is who makes the follow-up game. A sequel made by the original game's creator (or members of the team that made the original game) is likely to be perceived more positively than a game made by a completely different developer.

(There are exceptions for a few studios. Knights of the Old Republic 2 and Fallout: New Vegas, developed by Obsidian Entertainment, while less appealing to some players of KOTOR and Fallout 3, received higher marks from many gamers. And Eidos's respectful handling of its Deus Ex prequel muted much of the negative discussion of its in-development Thief sequel. At worst, sequels made by studios that fans of the earlier games feel they can trust fall somewhere between positive and mixed reception.)

Changing the display engine or target platform often generates some disapproval. This showed up in particular after 2000, when primary development shifted from the PC to the new generation of game consoles. Deus Ex and The Elder Scrolls are examples of franchises that suffered from this perception: Deus Ex: Invisible War and TES IV: Oblivion were developed first for consoles and then ported to the PC, the platform of their predecessors, and both are frequently criticized as "dumbed down" by fans of the earlier games.

BioShock, though not a direct sequel to the PC-based System Shock games, also met with some of this criticism, but overcame it by creating a new and strongly-realized setting for the fairly similar game mechanics. BioShock also shows that falling into this category doesn't imply that the later games must be "bad," either artistically or commercially. Gamers lost to the "dumbed down" problem may be replaced by those who gravitate to or grow up using the newer target platforms.

A final factor appears to be whether a sequel makes significant alterations to the primary gameplay mechanics (and often the player's visual perspective) associated with a popular franchise. The X-COM and Fallout franchises went through this -- fans of the turn-based, tactical format of the earlier games reacted very negatively to the shift to real-time, first-person shooter gameplay in the later games. Fans of Fallout 1/2 can still be found grousing about the change in Fallout 3 despite the later game's evident quality and popularity. Mass Effect 2 was criticized for significantly reducing the number of character skill options from the more RPG-like original Mass Effect. And 2K's as-yet-unreleased first-person shooter take on X-COM (which was recently revealed as having been changed to third-person perspective) generated more negative comment than Firaxis's more faithful recreation.

These effects are understandable, and maybe unavoidable. It's impossible for a sequel to perfectly please every gamer who enjoyed the initial game(s) while at the same time changing enough to attract new players. Gamers as a group are notorious for wanting "the same, only different." If it's too different, you lose the fans who liked the original game. But if it's too similar, you'll be criticized for "charging for the same game twice."

It's also creatively and financially risky to make too many trips to the same well without perking things up somehow -- consumers of any kind of entertainment will eventually tune out. Finally, from a developer's viewpoint it's just less fun to iterate on a well-known formula than to make a new game that stretches some different developer muscles.

Those realities acknowledged, it's also true (as Simon Ludgate recently pointed out) that if you're going to make a game that purports to be a new entry in a popular series, then your new game's design ought to at least include some core elements from the games that made the series popular. This is both a matter of courtesy and business: it does not pay to antagonize the people who are the biggest (and often most vocal) fans of the franchise you're trying to extend.

Finding the balance point between respecting the past while meeting new modern expectations is hard. But the reward for doing it well is gamer trust that translates directly into future sales.

Otherwise, just call it a "spiritual successor"....