Our work-in-progress, Exploit: Zero Day, will be a social, online game. It'll let you roleplay as a hacker with other players and even interact with the game's characters on the forums. The kind of community that forms will be essential to the game's success.
At the same time, the Internet can be an abusive place, especially in gaming spheres, and we're producing a game themed around "hacking" and computer security intrusion, one that attempts to present compelling dilemmas. There's a definite risk of our community becoming hostile, toxic, or downright dangerous.
Part of our vision statement at Future Proof Games is "audacious compassion." How do we create a community that encourages audacious compassion while being fun and financially self-sustaining? We're still working out the details, but here are our thoughts at the moment.
In meatspace there exist intentional communities, such as communes, consisting of groups of people living together with some purposes and principles in common. This is what online communities are, with varying degrees of intention. The community administrators (in this case, us) invite others to join in discussion and conversation. By establishing standards of behavior, the administrators and prominent users shape the mood and attitude of the community. It's important to realize that these standards can be explicit or implicit.
Codes of Conduct
There's been a bit of furor over the past couple of years regarding professional conference codes of conduct, with some folks questioning whether they are necessary. Codes of conduct are explicit standards of behavior. They say, "here is what is and is not acceptable" and, if they're any good, "here is what happens if you do something unacceptable." Without such a standard, the expectations and consequences of behavior are unclear.
The fact that these standard lists of forum rules are so familiar and so similar is a red flag to me. Thousands of forums have one of these sets of rules, and few of them are enforced. The terms are vague, and I doubt many users give them much weight. Compare this to the open source ConfCodeOfConduct, which makes it clear what harassment is in the context of a conference and gives examples.
One essential step, then, is to establish an explicit and enforceable code of conduct that makes it clear what behavior is unacceptable.
Setting a Tone
Implicit standards are those that aren't written out but that visitors can pick up or infer. These are necessarily vaguer; they guide behavior rather than putting boundaries on it. For our forums, we'll be using Discourse, a system with the stated goal of enabling civilized discourse. While their proposed rules don't give examples and fall afoul of the "don't feed the trolls" fallacy, the system itself has some very useful features.
Beyond being a slick and modern forum app, Discourse has the concept of trust. Users gain "trust levels" by participating, which opens up new abilities on the forum. If the community starts with an appropriate tone, this trust reinforces that tone by rewarding active participants.
This only works if you start with the right tone, and that means being picky about who your early contributors are and, more importantly, making sure that the administrators follow their own rules. The admins need to show as well as tell in order to establish the expected tone of the board. There are many more options for encouraging implicit standards, but the basics of establishing a tone and rewarding participation in that environment are a good start.
League of Legends, from Riot Games, was once known as having one of the most toxic social environments of any online community. The game is also tremendously popular, which both made that problem worse and gave Riot the resources to hire dedicated experts to address it. In a GDC talk, their Lead Social Systems Designer explored how they went about it. It wasn't enough to just ban players; they had to make sure that players understood why they were banned and reflect their behavior back at them.
Riot has done an assortment of other things as well: empowering players to report other players, making a public Tribunal system to clarify consequences, creating a positive Honor system to reward friendly players, and taking advantage of priming to subtly nudge players away from toxic behaviors.
Making community standards mean something is vitally important. It's not enough to just say certain behavior is bad. The admins need to act to discourage toxic behavior, making it clear what they are doing and why. Likewise, behavior that is in line with the community standards should be recognized and rewarded in a concrete way.
This is true even when money is involved. It's worth it to lose one toxic customer, even a big one, in order to maintain a community. I have mixed feelings about taking their money and then denying them access to the individual or anonymous aspects of the game, but it is a good thing to ban an egregious offender from the social community no matter how much they've paid.
So we here at Future Proof need to do several things to create a positive environment that is in line with our value of audacious compassion. We have to:
- Establish a clear code of conduct with specific examples.
- Set an initial tone that matches how we want the community to behave.
- Encourage people to follow that tone by rewarding active contributors.
- Take clear and concrete action to discourage toxic behavior, even from paying customers.
- Recognize and reward positive behavior.
What do you think? Are there other ways in which we can encourage a nurturing online community? Is there anything we should avoid? Please leave your thoughts in the comments.