Is player creativity desirable in games?
There's a subset of developers who seem to think so. They like the idea that players should be able to express behaviors and create objects in a gameworld that they (the developers) never thought of.
But these appear to be a distinct minority. Most games are deliberately designed and developed to prevent any truly creative play. In particular, the number of in-game effects that characters and objects can demonstrate is cut back as much as possible.
Why take such pains? Why are most developers so determined to strictly limit player verbs or possible system interactions if player creativity is such a great thing?
There are several not entirely bad reasons why. Unfortunately for the game industry, I believe the combination of these justifications leads to a large majority of games that are so tightly controlled they nearly play themselves.
OFFENSIVE CONTENT
One problem with allowing player creativity is rude content.
If you let players do things that modify the gameworld, particularly if they can interact with other players in any way, they are guaranteed to spell out naughty words, erect enormous genitalia, and build penisauruses. (Google "Sporn" for NSFW examples of how gamers immediately used Spore's creativity tools.)
Developers can accept this if they're OK with a mature rating for their game, but creativity tools make it tough to sell a multiplayer game that's kid-safe.
ANYTHING UNPLANNED IS A BUG
Another problem is that emergent behaviors can look to some gamers like bugs.
That doesn't make them actual bugs, which for games are best defined as behavior that works against the intended play experience. Unintended is not the same thing as unwanted.
The developers of Dishonored, for example, were surprised to see their playtesters possess a target while plummeting from a great height, thus avoiding deceleration trauma. It wasn't intended -- it emerged from the interaction of systems -- but it made sense within the play experience Arkane had in mind. So it wasn't a bug, it was a feature... and it got to stay in the game. That appears to be a rare exception to standard practice, though.
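To make that concrete, here's a minimal sketch (with entirely hypothetical names -- this is not Arkane's code) of how two systems, each sensible on its own, can combine into something nobody designed:

```python
# Hypothetical sketch of emergence from interacting systems, in the
# spirit of the Dishonored example above.

class Character:
    def __init__(self, name):
        self.name = name
        self.health = 100
        self.fall_speed = 0.0

    def apply_fall_damage(self):
        # Falling system: damage scales with speed at the moment of impact.
        if self.fall_speed > 10.0:
            self.health -= int((self.fall_speed - 10.0) * 5)
        self.fall_speed = 0.0

def possess(player, target):
    # Possession system: designed for stealth, with no thought
    # given to falling at all. The new body starts at rest.
    target.fall_speed = 0.0
    return target

# Emergent interaction: possess a rat mid-plummet, land unharmed.
corvo = Character("player")
corvo.fall_speed = 40.0          # terminal-velocity trouble
rat = Character("rat")
body = possess(corvo, rat)       # mid-air possession
body.apply_fall_damage()
print(body.health)               # 100 -- unintended, but consistent
```

Neither system is broken; the surprise lives entirely in their interaction.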
NO CRAFT IN CRAFTING
Crafting in MMORPGs is not creative. Crafting -- making objects -- in MMORPGs has nothing to do with "craft" or being "crafty"; it's about mass-producing widgets to win economic competition play. That's a perfectly valid kind of play. But it isn't creative.
An argument might be made that some creativity is needed to sell a lot of stuff. But that's not related to crafting as a process of imagining new kinds of objects that meet specific purposes and elegantly bringing them into existence within a gameworld. That's "craftsmanship," and it's what a crafting system worthy of the name would be... but that's not what crafting in MMORPGs ever actually is.
A truly creative crafting system would allow the internal economy of a gameworld to grow through the invention of new IP. Wouldn't that be an interesting way to counter mudflation?
To be fair, a creative crafting system would probably far outshine the rest of most MMORPGs. Part of the crafting system in the late Star Wars Galaxies (SWG) was highly regarded -- oddly enough, it was so engaging in its own right that it never quite fit the rest of a Star Wars game.
So what might a MMORPG (i.e., not Second Life) with a truly creativity-encouraging crafting system look like? In what kind of gameworld would the ability for players to imagine and implement entirely new kinds of things be appropriate?
CLASSES VERSUS SKILLS
Yet another reason to deprecate player creativity is game balance. Especially in multiplayer games, developers not unreasonably want to try to keep the playing field level for players using (marginally) different playstyles.
A common way this gets expressed is by organizing character skills in level-controlled classes. It's more interesting to key character abilities to skills, and let players pick and choose the skills they want. But this (developers have decided) allows the emergence of character ability combinations that may be either unexpectedly "overpowered" or too "weak" to compete effectively with players of similar skill levels.
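To see the difference in data-model terms, here's a minimal sketch -- hypothetical names, not drawn from SWG or any real game -- of class-keyed abilities versus point-bought skills:

```python
# Class-based: abilities are fixed per class and unlocked by level.
CLASS_ABILITIES = {
    "rogue": {1: "stab", 10: "stealth", 20: "poison"},
}

def abilities_for(class_name, level):
    """Every level-20 rogue gets exactly the same list."""
    return [a for lvl, a in CLASS_ABILITIES[class_name].items() if lvl <= level]

# Skill-based: players spend points on any mix of skills, so two
# characters of equal power can have very different capabilities --
# including combinations the designers never anticipated.
class SkillCharacter:
    def __init__(self, points):
        self.points = points
        self.skills = set()

    def learn(self, skill, cost):
        if cost <= self.points:
            self.points -= cost
            self.skills.add(skill)

scout = SkillCharacter(points=10)
scout.learn("marksmanship", 4)
scout.learn("creature_taming", 6)   # a mix no fixed class would ship
```

The balancing problem is visible right in the structure: the first model has a handful of enumerable builds to test; the second has a combinatorial explosion of them.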
This perspective that "interacting systems allow emergent effects that interfere with the intended play experience and therefore must be minimized" explains (as one example) why Sony Online Entertainment completely deleted the extensive individual skills system of the original Star Wars Galaxies and replaced it with a few static classes with specific abilities at developer-determined levels, just like pretty much every other MMORPG out there.
The New Gameplay Experience was well-regarded by some of SWG's new players. But many long-time players felt that the original SWG's unique skills-based ability model was much more creatively satisfying. When it was changed so radically to a class-based model, eliminating their ability to express themselves in a detailed way through their character's abilities, they left the game.
EVE Online also allows skill selection, but in practice most people wind up with the same skills. So is it possible any longer to offer a major MMORPG that encodes player abilities in mix-and-match skills, rather than a small set of classes in which my Level 80 Rogue is functionally identical to your Level 80 Rogue?
CODING TO THE TEST CASES
One more reason why emergence gets locked down in games starts, ironically, with sensibly trying to use more mature software development practices.
Test case driven software development is the process of documenting what your code is supposed to do through well-defined requirements, then writing test cases that describe how to find out whether the software you actually write meets those requirements.
That's often a Good Thing. It helps to ensure that what you deliver is what your customers are expecting. But there is a dark side to this process, as there can be for any process: if your organization gets top-heavy, with many layers between the people running things and the people doing the actual game development, the process eventually tends to become the deliverable. Reality becomes whatever the process says it is, because process is easier to measure than the meaning of any development action: "How many lines of code did you write today?"
The practical result of enforcing an "everything must have a test case" process is that every planned feature gets a test case. That's actually pretty handy for testing to a well-defined set of expectations.
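As a minimal sketch of what that looks like in practice (hypothetical requirement IDs and function names throughout), consider a door-unlocking feature and its tests:

```python
import unittest

def open_door(door, key):
    # Hypothetical requirement DR-17: a locked door opens only with
    # its matching key.
    return (not door["locked"]) or key == door["key_id"]

class TestDoorRequirements(unittest.TestCase):
    def test_locked_door_rejects_wrong_key(self):
        door = {"locked": True, "key_id": "brass"}
        self.assertFalse(open_door(door, "iron"))

    def test_locked_door_opens_with_matching_key(self):
        door = {"locked": True, "key_id": "brass"}
        self.assertTrue(open_door(door, "brass"))

# Note what's absent: no test case says "burning the door down should
# also work," so under a strict process that emergent option never ships.

if __name__ == "__main__":
    unittest.main()
```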
Unfortunately, the all-too-common corollary is: if we didn't write a test case for it, you're not allowed to have that feature. At that point, the process has become your deliverable, and your game is very unlikely to tolerate any creativity from its players. It might be a good game by some standard. But it probably won't be memorable.
Still, a process for reliably catching real bugs is valuable. So how can the desire to allow some creativity and the need to deliver measurably high quality coexist?
EPIC STORY MUST BE TOLD AS-IS
Finally, there is the problem of the Epic Story.
Emergent gameplay invites exploratory creativity. But broadly emergent gameplay interferes with a carefully crafted narrative. The more epic and detailed the story -- which translates to more development money spent on that content -- the less freedom you can give players to go do their own wacky things, because then they might not see that expensive content. The Witcher 2 fought this somewhat, but it's emphatically the exception.
Is there a middle ground between developer story and player freedom? Or is there a way to design a game so that both of these can be expressed strongly?
To sum up: from the perspective of many game developers, especially in the AAA realm, "emergent" automatically equals "bug." A mindset that only the developers know how the game is meant to be played, rather than a respect for what players themselves enjoy doing, leads many developers to design against creativity. The idea of actually increasing the number of systems or permitted system interactions seems to be something that simply will not be permitted.
The result is that player creativity in these games is so constrained as to be nonexistent. You're just mashing buttons until you solve each challenge, in proper order, in the one way the developers intended.
Is there any sign that this might be changing, perhaps as the success of some indie games demonstrates that there is a real desire for games that encourage player creativity?
Monday, November 12, 2012
Plausibility Versus Realism
Every now and then, a forward-thinking, open, courteous, kind, thrifty and generally very attractive group of developers will choose to let gamers see a little of their design thinking for a game in development.
For roleplaying games such as Obsidian Entertainment's recently (and very successfully) Kickstarted Project Eternity, the conversation can be informed and thoughtful. Not all of the ideas suggested by enthusiastic armchair designers will be right for a particular game, but the level of discussion is frequently very high.
However, in the years I've observed such forums, one conversational glitch inevitably appears. It doesn't take long before even very knowledgeable commenters begin to argue in favor of gameplay systems that replicate real-world physical effects.
THE DESERT OF THE REAL
They might be asking for armor that has weight and thus restricts physically weak characters from equipping it at all. Or maybe it's for weapons that require specialized training, so that a character must have obtained a particular skill to use certain weapons. Sometimes there's a request for a detailed list of damage types, or for complex formulas for calculating damage amounts, or that environmental conditions like rain or snow should reduce movement rates or make it harder to hit opponents.
What all these and similar design ideas share (other than an enthusiasm for functionally rich environments) is an unspoken assumption that the RPG in question needs more realism.
Later on I'll go into where I think this assumption comes from. For now, I'd like to consider why I think there's a better approach when trying to contribute to a game's design -- instead of realism, the better metric is plausibility.
PLAUSIBILITY DEFINED
The difference between realism and plausibility is a little subtle, but it's not just semantic. Realism is about selecting certain physical aspects of our real world and simulating them within the constructed reality of the game world; plausibility is about designing systems that specifically feel appropriate to that particular game world. Plausibility is better than realism in designing a game with a created world -- what Tolkien called a "secondary reality" -- because realism crosses the boundary of the magic circle separating the real world from the logic of the constructed world while plausible features are 100% contained within the lore of the created world.
To put it another way, plausibility is a better goal than realism because designing a game-complete set of internally consistent systems delivers more fun than achieving limited consistency with real-world qualities and processes. Making this distinction is crucial when it comes to designing actual game systems. Every plausible feature makes the invented world better; the same isn't true of all realistic features imported into the gameworld. Being realistic doesn't necessarily improve the world of the game.
Despite this, a design idea based on realism often sounds reasonable at first. We're accustomed to objects having physical properties like size and weight, for example, as well as dynamic properties such as destructibility and combustibility. So when objects are to be implemented in a game world, it's natural to assume that those objects should be implemented to express those physical characteristics.
But there are practical and creative reasons not to make that assumption.
WHY NOT REALISM?
Creating simulations of physical systems -- which is the goal of realism -- requires money and time to get those systems substantially right relative to how the real world works. Not only must the specific system be researched, designed and tested, but all the combinations of simulated systems that interface with each other must be tested -- all the emergent behaviors have to seem realistic, too.
Trying to meet a standard of operation substantially similar to something that exists outside the imaginary world of the game is just harder than creating a system that only needs to be consistent with other internal game systems.
Maybe worst of all, there are so many real-world physical processes that it's impossible to mimic even a fraction of them in a game. Something will always be left out. And the more you've worked to faithfully model many processes, the more obvious it will be that some process is "missing." This will therefore be the very first thing that any reviewer or player notices. "This game claims to be realistic, but I was able to mix oil and water. Broken! Unplayable! False advertising!"
This isn't to say that all simulation of physical processes is hard/expensive, or that there can never be a good justification for including certain processes (gravity, for example). Depending on your game, it might be justifiable to license a physics simulation library such as Havok.
But the value of implementing any feature always has to be assessed by comparing the likely cost to the prospective benefits. For many games (especially those being made on a tight budget), realism should almost always be secondary to plausibility because realism costs more without necessarily delivering more fun for more players.
SOME GAMES ARE MORE REAL THAN OTHERS
Knowing when to apply either realism or plausibility as the design standard depends on understanding what kind of game you're making. If a physical behavior is core to your gameplay -- throwing objects in 3D space, say -- then the value of simulating realistic effects like gravity increases, because the benefits of those effects are high for that particular game. Otherwise, you're better off implementing only a plausible solution or excluding that effect entirely.
Let's say you're making a 3D tennis game. The core of tennis is how the ball behaves, and the fun comes from how players are able to affect that behavior. So it makes sense for your game design to emphasize in a reasonably realistic way the motion of a ball in an Earth-normal gravity field (a parabola), as well as how the angle and speed of a racquet's elastic collision with a ball alters the ball's movement. If it's meant to be a serious sports simulation, you might also choose to model some carefully selected material properties (clay court versus grass) and weather conditions (humidity, rain).
But you probably wouldn't want to try to simulate things like the Earth's curvature, or solar radiation, or spectator psychology. They don't have enough impact on the core fun to justify the cost to simulate them. And for a simple social console game, ball movement and racquet impact are probably the only things that need limited realism. The better standard for everything else that has to be implemented as the world of the game is plausibility. If it doesn't help the overall game feel logically and emotionally real in and of itself, then it doesn't meet the standard and probably should not be implemented.
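For the curious, here's roughly what that "limited realism" might reduce to in code -- a minimal sketch of parabolic ball flight and a crude racquet collision, with illustrative, untuned numbers and everything else (spin, drag, court surface) abstracted away:

```python
G = 9.81  # m/s^2, Earth-normal gravity

def step_ball(pos, vel, dt=0.016):
    """Advance ball position and velocity one frame; gravity only."""
    x, y = pos
    vx, vy = vel
    x += vx * dt
    y += vy * dt - 0.5 * G * dt * dt
    vy -= G * dt
    return (x, y), (vx, vy)

def racquet_hit(vel, racquet_speed, restitution=0.9):
    """Crude elastic collision: reverse the ball, add racquet speed."""
    vx, vy = vel
    return (-vx * restitution + racquet_speed, abs(vy) * restitution)

pos, vel = (0.0, 1.0), (15.0, 2.0)
for _ in range(60):            # simulate one second of flight
    pos, vel = step_ball(pos, vel)
```

A few dozen lines buys the part of realism that the fun actually depends on; everything beyond this is a cost/benefit question.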
A ROLEPLAYING GAME NEEDS ITS OWN REALITY
This is even more true for a character-based game, which requires a world in which those characters have a history and relationships and actions they can take with consequences that make sense. In designing a 3D computer role-playing game, the urge to add realistic qualities and processes to the world of that game can be very hard to resist.
Action-oriented Achiever gamers are usually OK with abstracted systems; the fun for them comes in winning through following the rules, whatever those rules might be. But for some gamers -- I'm looking at you, Idealist/Narrativists and my fellow Rational/Explorers -- the emotional meanings and logical patterns of those rules matter a great deal.
Characters who behave like real people, and dynamic world-systems that behave like real-world systems, make the game more fun for us. We find pleasure in treating the gameworld like it's a real place inhabited by real people, not just a collection of arbitrary rules to be beaten. For us, games (and 3D worlds with NPCs in particular) are more fun when they offer believable Thinking and Feeling content in addition to Doing and Having content.
Providing that kind of content for Narrativist gamers is non-trivial. "Realistic" NPC AI is hard because, as humans, we know intimately what sensible behavior looks like. So while there are often calls for NPCs to seem "smarter," meaning that they're more emotionally or logically realistic as people, it's tough for game developers to sell to publishers the value of the time (i.e., money) that would be required to add realistic AI as another feature. Developers of RPGs usually have a lot of systems to design and code and test. So working on AI systems that give NPCs the appearance of a realistic level of behavioral depth, in addition to all the other features, is very hard to justify. (The Storybricks technology is intended to help with exactly this: building more plausible NPCs.)
Another argument against realism in a gameworld is complexity. Most developers prefer to build simple, highly constrained, testable cause/effect functions rather than the complex, interacting systems that produce the surprising emergent behaviors found in the real world. Explorers find that hard to accept. Explorers aren't just mappers of physical terrain; they are discoverers of dynamic systems -- they love studying and tinkering with moving parts, all interacting as part of a logically coherent whole.
Explorers also tend to know a little about a lot of such systems in the real world, from geologic weathering to macroeconomics to the history of arms and armor and beyond. So it's natural for them to want to apply that knowledge of real systems to game systems. Since you're building a world anyway (they reason), you might as well just add this one little dynamic system that will make the game feel totally right.
Now multiply that by a hundred, or a thousand. And then make all those systems play nicely together, and cohere to support the core vision for the intended play experience. "Hard to do well" is an understatement.
A CREATED WORLD DOESN'T NEED OUR WORLD'S REALISM
That's the practical reason why, for most systems in a worldy game, plausibility will usually be the better standard. If the game is meant to emphasize melee combat, for example, then having specific damage types caused by particular weapons and mitigated by certain armors might sound good. Moderate simulation of damage delivery and effects might be justifiable. Those action features will, if they're paced well and reward active play, satisfy most gamers.
But a role-playing game must emphasize character relationships in an invented society, where personal interactions with characters and the lore of the world are core gameplay and combat is just one form of "interaction" among several. For that kind of game, the better choice is probably to limit the design of that game's combat system to what feels plausible -- get hit, lose some health -- and to abstract away the details.
Plausible systems are especially desirable in roleplaying games because they meet the needs of Explorers and Narrativists. Intellectually and emotionally plausible elements of the game feel right. They satisfy our expectations for how things and people should appear to behave in that created world.
A plausible combat system deals and mitigates damage; it can but doesn't need to distinguish between damage types. A plausible "weather" system has day/night cycles and maybe localized rain; it doesn't require the modeling of cold fronts and ocean temperatures and terrain elevation. A plausible economy has faucets and drains, and prices generally determined by supply and demand; it doesn't have to be a closed system. Plausibility ensures that every feature fits the world of the game without doing more than is necessary.
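As a minimal sketch of that faucets-and-drains idea (hypothetical names and constants, not a recipe from any shipped game):

```python
class Market:
    def __init__(self, base_price):
        self.base_price = base_price
        self.supply = 100
        self.demand = 100

    def faucet(self, amount):
        # e.g. monster drops or mining inject goods into the world
        self.supply += amount

    def drain(self, amount):
        # e.g. repair costs or consumables remove goods from the world
        self.supply = max(0, self.supply - amount)

    def price(self):
        # Price drifts with the demand/supply ratio; no attempt to
        # model a real economy, just enough to feel right.
        return round(self.base_price * self.demand / max(1, self.supply), 2)

iron = Market(base_price=10.0)
iron.faucet(50)        # a busy day of mining floods the market
print(iron.price())    # price dips below 10 as supply outpaces demand
```

Note how little machinery that takes compared to a closed-system simulation, while still behaving the way players intuitively expect markets to behave.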
Plausibility as the standard is the best of both worlds: it gives players lots of different kinds of things to do, and it keeps each system simple enough to be worth building and feasible for mere mortals to implement.
CONCLUSION
So, to gamers enthusiastic about adding realistic interactive capabilities to a brand-new gameworld, I say: whoa, there! Before you ask the designers to add this one additional little thing that would help make the game more "realistic," stop and think about that idea from a game designer's perspective.
Developers can't add every great idea, even if they're still in the design phase. But the chances of seeing some feature suggestion implemented improve if you can explain how it makes the unique world of that game feel more plausible at minimal development cost.